Ethical Considerations in Research | Types & Examples

Published on October 18, 2021 by Pritha Bhandari. Revised on May 9, 2024.

Ethical considerations in research are a set of principles that guide your research designs and practices. Scientists and researchers must always adhere to a certain code of conduct when collecting data from people.

The goals of human research often include understanding real-life phenomena, studying effective treatments, investigating behaviors, and improving lives in other ways. What you decide to research and how you conduct that research involve key ethical considerations.

These considerations work to

  • protect the rights of research participants
  • enhance research validity
  • maintain scientific or academic integrity

Table of contents

  • Why do research ethics matter?
  • Getting ethical approval for your study
  • Types of ethical issues
  • Voluntary participation
  • Informed consent
  • Confidentiality
  • Potential for harm
  • Results communication
  • Examples of ethical failures
  • Frequently asked questions about research ethics

Why do research ethics matter?

Research ethics matter for scientific integrity, human rights and dignity, and collaboration between science and society. These principles make sure that participation in studies is voluntary, informed, and safe for research subjects.

You’ll balance pursuing important research objectives with using ethical research methods and procedures. It’s always necessary to prevent permanent or excessive harm to participants, whether inadvertent or not.

Defying research ethics will also lower the credibility of your research because it’s hard for others to trust your data if your methods are morally questionable.

Even if a research idea is valuable to society, it doesn’t justify violating the human rights or dignity of your study participants.


Getting ethical approval for your study

Before you start any study involving data collection with people, you’ll submit your research proposal to an institutional review board (IRB).

An IRB is a committee that checks whether your research aims and research design are ethically acceptable and follow your institution’s code of conduct. They check that your research materials and procedures are up to code.

If successful, you’ll receive IRB approval, and you can begin collecting data according to the approved procedures. If you want to make any changes to your procedures or materials, you’ll need to submit a modification application to the IRB for approval.

If unsuccessful, you may be asked to re-submit with modifications or your research proposal may receive a rejection. To get IRB approval, it’s important to explicitly note how you’ll tackle each of the ethical issues that may arise in your study.

Types of ethical issues

There are several ethical issues you should always pay attention to in your research design, and these issues can overlap with each other.

You’ll usually outline ways you’ll deal with each issue in your research proposal if you plan to collect data from participants.

  • Voluntary participation: Your participants are free to opt in or out of the study at any point in time.
  • Informed consent: Participants know the purpose, benefits, risks, and funding behind the study before they agree or decline to join.
  • Anonymity: You don’t know the identities of the participants. Personally identifiable data is not collected.
  • Confidentiality: You know who the participants are, but you keep that information hidden from everyone else. You anonymize personally identifiable data so that it can’t be linked to other data by anyone else.
  • Potential for harm: Physical, social, psychological, and all other types of harm are kept to an absolute minimum.
  • Results communication: You ensure your work is free of plagiarism and research misconduct, and you accurately represent your results.

Voluntary participation

Voluntary participation means that all research subjects are free to choose to participate without any pressure or coercion.

All participants are able to withdraw from, or leave, the study at any point without feeling an obligation to continue. Your participants don’t need to provide a reason for leaving the study.

It’s important to make it clear to participants that there are no negative consequences or repercussions to their refusal to participate. After all, they’re taking the time to help you in the research process, so you should respect their decisions without trying to change their minds.

Voluntary participation is an ethical principle protected by international law and many scientific codes of conduct.

Take special care to ensure there’s no pressure on participants when you’re working with vulnerable groups of people who may find it hard to stop the study even when they want to.

Informed consent

Informed consent refers to a situation in which all potential participants receive and understand all the information they need to decide whether they want to participate. This includes information about the study’s benefits, risks, funding, and institutional approval.

You make sure to provide all potential participants with all the relevant information about

  • what the study is about
  • the risks and benefits of taking part
  • how long the study will take
  • your supervisor’s contact information and the institution’s approval number

Usually, you’ll provide participants with a text for them to read and ask them if they have any questions. If they agree to participate, they can sign or initial the consent form. Note that this may not be sufficient for informed consent when you work with particularly vulnerable groups of people.

If you’re collecting data from people with low literacy, make sure to verbally explain the consent form to them before they agree to participate.

For participants with very limited English proficiency, you should always translate the study materials or work with an interpreter so they have all the information in their first language.

In research with children, you’ll often need informed permission for their participation from their parents or guardians. Although children cannot give informed consent, it’s best to also ask for their assent (agreement) to participate, depending on their age and maturity level.

Anonymity

Anonymity means that you don’t know who the participants are and you can’t link any individual participant to their data.

You can only guarantee anonymity by not collecting any personally identifying information—for example, names, phone numbers, email addresses, IP addresses, physical characteristics, photos, and videos.

In many cases, it may be impossible to truly anonymize data collection. For example, data collected in person or by phone cannot be considered fully anonymous because some personal identifiers (demographic information or phone numbers) are impossible to hide.

You’ll also need to collect some identifying information if you give your participants the option to withdraw their data at a later stage.

Data pseudonymization is an alternative method where you replace identifying information about participants with pseudonymous, or fake, identifiers. The data can still be linked to participants but it’s harder to do so because you separate personal information from the study data.
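The separation described above can be sketched in a few lines of code. This is a minimal illustration, not a prescribed scheme: the field names, the pseudonym format, and the in-memory lookup table are all assumptions made for the example; in a real study, the lookup table would be stored separately from the study data, ideally encrypted.

```python
import secrets

def pseudonymize(records, key_field="name"):
    """Replace the identifying field in each record with a random
    pseudonym. Returns the pseudonymized records plus a separate
    lookup table mapping pseudonyms back to identities, which must
    be kept apart from the study data so the two cannot easily be
    re-linked."""
    lookup = {}
    out = []
    for record in records:
        pseudonym = "P-" + secrets.token_hex(4)
        lookup[pseudonym] = record[key_field]
        cleaned = dict(record)  # copy, so the original is untouched
        cleaned[key_field] = pseudonym
        out.append(cleaned)
    return out, lookup

data = [{"name": "Alice", "score": 7}, {"name": "Bob", "score": 4}]
pseudo, key_table = pseudonymize(data)
# `pseudo` contains no names; only `key_table` can re-link them.
```

Because the study data and the lookup table are returned separately, a researcher can store, share, or analyze the pseudonymized records while access to the re-identification key stays restricted.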

Confidentiality

Confidentiality means that you know who the participants are, but you remove all identifying information from your report.

All participants have a right to privacy, so you should protect their personal data for as long as you store or use it. Even when you can’t collect data anonymously, you should secure confidentiality whenever you can.

Some research designs aren’t conducive to confidentiality, but it’s important to make all attempts and inform participants of the risks involved.

Potential for harm

As a researcher, you have to consider all possible sources of harm to participants. Harm can come in many different forms.

  • Psychological harm: Sensitive questions or tasks may trigger negative emotions such as shame or anxiety.
  • Social harm: Participation can involve social risks, public embarrassment, or stigma.
  • Physical harm: Pain or injury can result from the study procedures.
  • Legal harm: Reporting sensitive data could lead to legal risks or a breach of privacy.

It’s best to consider every possible source of harm in your study as well as concrete ways to mitigate them. Involve your supervisor to discuss steps for harm reduction.

Make sure to disclose all possible risks of harm to participants before the study to obtain informed consent. If there is a risk of harm, prepare to provide participants with resources, counseling, or medical services if needed.

For example, if a survey includes sensitive questions that may bring up negative emotions, you inform participants about the sensitive nature of the survey and assure them that their responses will be confidential.

Results communication

The way you communicate your research results can sometimes involve ethical issues. Good science communication is honest, reliable, and credible. It’s best to make your results as transparent as possible.

Take steps to actively avoid plagiarism and research misconduct wherever possible.

Plagiarism

Plagiarism means submitting others’ works as your own. Although it can be unintentional, copying someone else’s work without proper credit amounts to stealing. It’s an ethical problem in research communication because you may benefit by harming other researchers.

Self-plagiarism

Self-plagiarism is when you republish or re-submit parts of your own papers or reports without properly citing your original work.

This is problematic because you may benefit from presenting your ideas as new and original even though they’ve already been published elsewhere in the past. You may also be infringing on your previous publisher’s copyright, violating an ethical code, or wasting time and resources by doing so.

In extreme cases of self-plagiarism, entire datasets or papers are sometimes duplicated. These are major ethical violations because they can skew research findings if taken as original data.

For example, you might notice that two published studies have similar characteristics even though they are from different years: their sample sizes, locations, treatments, and results are highly similar, and the studies share one author in common.

Research misconduct

Research misconduct means making up or falsifying data, manipulating data analyses, or misrepresenting results in research reports. It’s a form of academic fraud.

These actions are committed intentionally and can have serious consequences; research misconduct is not a simple mistake or a point of disagreement about data analyses.

Research misconduct is a serious ethical issue because it can undermine academic integrity and institutional credibility. It leads to a waste of funding and resources that could have been used for alternative research.

A notorious case is Andrew Wakefield’s 1998 study claiming a link between the MMR vaccine and autism. Later investigations revealed that he fabricated and manipulated his data to show a nonexistent link between vaccines and autism. Wakefield also neglected to disclose important conflicts of interest, and his medical license was taken away.

This fraudulent work sparked vaccine hesitancy among parents and caregivers. The rate of MMR vaccinations in children fell sharply, and measles outbreaks became more common due to a lack of herd immunity.

Examples of ethical failures

Research scandals with ethical failures are littered throughout history, but some took place not that long ago.

Some scientists in positions of power have historically mistreated or even abused research participants to investigate research problems at any cost. These participants were prisoners, patients under their care, or people who otherwise trusted them to treat them with dignity.

To demonstrate the importance of research ethics, we’ll briefly review two research studies that violated human rights in modern history.

The first is the set of Nazi medical experiments conducted on concentration camp prisoners during World War II. These experiments were inhumane and resulted in trauma, permanent disabilities, or death in many cases.

After some Nazi doctors were put on trial for their crimes, the Nuremberg Code of research ethics for human experimentation was developed in 1947 to establish a new standard for human experimentation in medical research.

The second is the Tuskegee syphilis study, which began in 1932 and was presented to participants as free medical care. In reality, the actual goal was to study the effects of syphilis when left untreated, and the researchers never informed participants about their diagnoses or the research aims.

Although participants experienced severe health problems, including blindness and other complications, the researchers only pretended to provide medical care.

When treatment became possible in 1943, 11 years after the study began, none of the participants were offered it, despite their health conditions and high risk of death.

Ethical failures like these resulted in severe harm to participants, wasted resources, and lower trust in science and scientists. This is why all research institutions have strict ethical guidelines for performing research.


Frequently asked questions about research ethics

Ethical considerations in research are a set of principles that guide your research designs and practices. These principles include voluntary participation, informed consent, anonymity, confidentiality, potential for harm, and results communication.

Scientists and researchers must always adhere to a certain code of conduct when collecting data from others.

These considerations protect the rights of research participants, enhance research validity, and maintain scientific integrity.

Research ethics matter for scientific integrity, human rights and dignity, and collaboration between science and society. These principles make sure that participation in studies is voluntary, informed, and safe.

Anonymity means you don’t know who the participants are, while confidentiality means you know who they are but remove identifying information from your research report. Both are important ethical considerations.

You can only guarantee anonymity by not collecting any personally identifying information—for example, names, phone numbers, email addresses, IP addresses, physical characteristics, photos, or videos.

You can keep data confidential by using aggregate information in your research report, so that you only refer to groups of participants rather than individuals.

These actions are committed intentionally and can have serious consequences; research misconduct is not a simple mistake or a point of disagreement but a serious ethical failure.

Cite this Scribbr article

Bhandari, P. (2024, May 09). Ethical Considerations in Research | Types & Examples. Scribbr. Retrieved August 21, 2024, from https://www.scribbr.com/methodology/research-ethics/



National Institute of Environmental Health Sciences

What is Ethics in Research & Why is it Important?

By David B. Resnik, J.D., Ph.D.

December 23, 2020

The ideas and opinions expressed in this essay are the author’s own and do not necessarily represent those of the NIH, NIEHS, or US government.


When most people think of ethics (or morals), they think of rules for distinguishing between right and wrong, such as the Golden Rule ("Do unto others as you would have them do unto you"), a code of professional conduct like the Hippocratic Oath ("First of all, do no harm"), a religious creed like the Ten Commandments ("Thou Shalt not kill..."), or wise aphorisms like the sayings of Confucius. This is the most common way of defining "ethics": norms for conduct that distinguish between acceptable and unacceptable behavior.

Most people learn ethical norms at home, at school, in church, or in other social settings. Although most people acquire their sense of right and wrong during childhood, moral development occurs throughout life, and human beings pass through different stages of growth as they mature. Ethical norms are so ubiquitous that one might be tempted to regard them as simple common sense. On the other hand, if morality were nothing more than common sense, then why are there so many ethical disputes and issues in our society?


One plausible explanation of these disagreements is that all people recognize some common ethical norms but interpret, apply, and balance them in different ways in light of their own values and life experiences. For example, two people could agree that murder is wrong but disagree about the morality of abortion because they have different understandings of what it means to be a human being.

Most societies also have legal rules that govern behavior, but ethical norms tend to be broader and more informal than laws. Although most societies use laws to enforce widely accepted moral standards, and ethical and legal rules use similar concepts, ethics and law are not the same. An action may be legal but unethical, or illegal but ethical. We can also use ethical concepts and principles to criticize, evaluate, propose, or interpret laws. Indeed, in the last century, many social reformers have urged citizens to disobey laws they regarded as immoral or unjust. Peaceful civil disobedience is an ethical way of protesting laws or expressing political viewpoints.

Another way of defining 'ethics' focuses on the disciplines that study standards of conduct, such as philosophy, theology, law, psychology, or sociology. For example, a "medical ethicist" is someone who studies ethical standards in medicine. One may also define ethics as a method, procedure, or perspective for deciding how to act and for analyzing complex problems and issues. For instance, in considering a complex issue like global warming , one may take an economic, ecological, political, or ethical perspective on the problem. While an economist might examine the cost and benefits of various policies related to global warming, an environmental ethicist could examine the ethical values and principles at stake.


Many different disciplines, institutions, and professions have standards for behavior that suit their particular aims and goals. These standards also help members of the discipline to coordinate their actions or activities and to establish the public's trust in the discipline. For instance, ethical standards govern conduct in medicine, law, engineering, and business. Ethical norms also serve the aims or goals of research and apply to people who conduct scientific research or other scholarly or creative activities. There is even a specialized discipline, research ethics, which studies these norms. See Glossary of Commonly Used Terms in Research Ethics and Research Ethics Timeline.

There are several reasons why it is important to adhere to ethical norms in research. First, norms promote the aims of research, such as knowledge, truth, and avoidance of error. For example, prohibitions against fabricating, falsifying, or misrepresenting research data promote the truth and minimize error.


Second, since research often involves a great deal of cooperation and coordination among many different people in different disciplines and institutions, ethical standards promote the values that are essential to collaborative work, such as trust, accountability, mutual respect, and fairness. For example, many ethical norms in research, such as guidelines for authorship, copyright and patenting policies, data sharing policies, and confidentiality rules in peer review, are designed to protect intellectual property interests while encouraging collaboration. Most researchers want to receive credit for their contributions and do not want to have their ideas stolen or disclosed prematurely.

Third, many of the ethical norms help to ensure that researchers can be held accountable to the public. For instance, federal policies on research misconduct, conflicts of interest, human subjects protections, and animal care and use are necessary in order to make sure that researchers who are funded by public money can be held accountable to the public.

Fourth, ethical norms in research also help to build public support for research. People are more likely to fund a research project if they can trust the quality and integrity of research.

Finally, many of the norms of research promote a variety of other important moral and social values, such as social responsibility, human rights, animal welfare, compliance with the law, and public health and safety. Ethical lapses in research can significantly harm human and animal subjects, students, and the public. For example, a researcher who fabricates data in a clinical trial may harm or even kill patients, and a researcher who fails to abide by regulations and guidelines relating to radiation or biological safety may jeopardize his health and safety or the health and safety of staff and students.

Codes and Policies for Research Ethics

Given the importance of ethics for the conduct of research, it should come as no surprise that many different professional associations, government agencies, and universities have adopted specific codes, rules, and policies relating to research ethics. Many government agencies have ethics rules for funded researchers.

  • National Institutes of Health (NIH)
  • National Science Foundation (NSF)
  • Food and Drug Administration (FDA)
  • Environmental Protection Agency (EPA)
  • US Department of Agriculture (USDA)
  • Singapore Statement on Research Integrity
  • American Chemical Society, The Chemist Professional’s Code of Conduct
  • Code of Ethics (American Society for Clinical Laboratory Science)
  • American Psychological Association, Ethical Principles of Psychologists and Code of Conduct
  • Statement on Professional Ethics (American Association of University Professors)
  • Nuremberg Code
  • World Medical Association's Declaration of Helsinki

Ethical Principles

The following is a rough and general summary of some ethical principles that various codes address*:

Honesty

Strive for honesty in all scientific communications. Honestly report data, results, methods and procedures, and publication status. Do not fabricate, falsify, or misrepresent data. Do not deceive colleagues, research sponsors, or the public.


Objectivity

Strive to avoid bias in experimental design, data analysis, data interpretation, peer review, personnel decisions, grant writing, expert testimony, and other aspects of research where objectivity is expected or required. Avoid or minimize bias or self-deception. Disclose personal or financial interests that may affect research.

Integrity

Keep your promises and agreements; act with sincerity; strive for consistency of thought and action.


Carefulness

Avoid careless errors and negligence; carefully and critically examine your own work and the work of your peers. Keep good records of research activities, such as data collection, research design, and correspondence with agencies or journals.

Openness

Share data, results, ideas, tools, resources. Be open to criticism and new ideas.


Transparency

Disclose methods, materials, assumptions, analyses, and other information needed to evaluate your research.


Accountability

Take responsibility for your part in research and be prepared to give an account (i.e. an explanation or justification) of what you did on a research project and why.


Intellectual Property

Honor patents, copyrights, and other forms of intellectual property. Do not use unpublished data, methods, or results without permission. Give proper acknowledgement or credit for all contributions to research. Never plagiarize.


Confidentiality

Protect confidential communications, such as papers or grants submitted for publication, personnel records, trade or military secrets, and patient records.


Responsible Publication

Publish in order to advance research and scholarship, not to advance just your own career. Avoid wasteful and duplicative publication.


Responsible Mentoring

Help to educate, mentor, and advise students. Promote their welfare and allow them to make their own decisions.


Respect for Colleagues

Respect your colleagues and treat them fairly.


Social Responsibility

Strive to promote social good and prevent or mitigate social harms through research, public education, and advocacy.


Non-Discrimination

Avoid discrimination against colleagues or students on the basis of sex, race, ethnicity, or other factors not related to scientific competence and integrity.

Competence

Maintain and improve your own professional competence and expertise through lifelong education and learning; take steps to promote competence in science as a whole.

Legality

Know and obey relevant laws and institutional and governmental policies.


Animal Care

Show proper respect and care for animals when using them in research. Do not conduct unnecessary or poorly designed animal experiments.


Human Subjects Protection

When conducting research on human subjects, minimize harms and risks and maximize benefits; respect human dignity, privacy, and autonomy; take special precautions with vulnerable populations; and strive to distribute the benefits and burdens of research fairly.

* Adapted from Shamoo A and Resnik D. 2015. Responsible Conduct of Research, 3rd ed. (New York: Oxford University Press).

Ethical Decision Making in Research

Although codes, policies, and principles are very important and useful, like any set of rules, they do not cover every situation, they often conflict, and they require interpretation. It is therefore important for researchers to learn how to interpret, assess, and apply various research rules and how to make decisions and act ethically in various situations. The vast majority of decisions involve the straightforward application of ethical rules. For example, consider the following case:

The research protocol for a study of a drug on hypertension requires the administration of the drug at different doses to 50 laboratory mice, with chemical and behavioral tests to determine toxic effects. Tom has almost finished the experiment for Dr. Q. He has only 5 mice left to test. However, he really wants to finish his work in time to go to Florida on spring break with his friends, who are leaving tonight. He has injected the drug in all 50 mice but has not completed all of the tests. He therefore decides to extrapolate from the 45 completed results to produce the 5 additional results.

Many different research ethics policies would hold that Tom has acted unethically by fabricating data. If this study were sponsored by a federal agency, such as the NIH, his actions would constitute a form of research misconduct, which the government defines as "fabrication, falsification, or plagiarism" (or FFP). Actions like this, which nearly all researchers classify as unethical, are viewed as misconduct. It is important to remember, however, that misconduct occurs only when researchers intend to deceive: honest errors related to sloppiness, poor record keeping, miscalculations, bias, self-deception, and even negligence do not constitute misconduct. Also, reasonable disagreements about research methods, procedures, and interpretations do not constitute research misconduct. Consider the following case:

Dr. T has just discovered a mathematical error in his paper that has been accepted for publication in a journal. The error does not affect the overall results of his research, but it is potentially misleading. The journal has just gone to press, so it is too late to catch the error before it appears in print. In order to avoid embarrassment, Dr. T decides to ignore the error.

Dr. T's error is not misconduct, nor is his decision to take no action to correct the error. Most researchers, as well as many different policies and codes, would say that Dr. T should tell the journal (and any coauthors) about the error and consider publishing a correction or erratum. Failing to publish a correction would be unethical because it would violate norms relating to honesty and objectivity in research.

There are many other activities that the government does not define as "misconduct" but which are still regarded by most researchers as unethical. These are sometimes referred to as "other deviations" from acceptable research practices and include:

  • Publishing the same paper in two different journals without telling the editors
  • Submitting the same paper to different journals without telling the editors
  • Not informing a collaborator of your intent to file a patent in order to make sure that you are the sole inventor
  • Including a colleague as an author on a paper in return for a favor even though the colleague did not make a serious contribution to the paper
  • Discussing with your colleagues confidential data from a paper that you are reviewing for a journal
  • Using data, ideas, or methods you learn about while reviewing a grant or a paper without permission
  • Trimming outliers from a data set without discussing your reasons in the paper
  • Using an inappropriate statistical technique in order to enhance the significance of your research
  • Bypassing the peer review process and announcing your results through a press conference without giving peers adequate information to review your work
  • Conducting a review of the literature that fails to acknowledge the contributions of other people in the field or relevant prior work
  • Stretching the truth on a grant application in order to convince reviewers that your project will make a significant contribution to the field
  • Stretching the truth on a job application or curriculum vita
  • Giving the same research project to two graduate students in order to see who can do it the fastest
  • Overworking, neglecting, or exploiting graduate or post-doctoral students
  • Failing to keep good research records
  • Failing to maintain research data for a reasonable period of time
  • Making derogatory comments and personal attacks in your review of an author's submission
  • Promising a student a better grade for sexual favors
  • Using a racist epithet in the laboratory
  • Making significant deviations from the research protocol approved by your institution's Animal Care and Use Committee or Institutional Review Board for Human Subjects Research without telling the committee or the board
  • Not reporting an adverse event in a human research experiment
  • Wasting animals in research
  • Exposing students and staff to biological risks in violation of your institution's biosafety rules
  • Sabotaging someone's work
  • Stealing supplies, books, or data
  • Rigging an experiment so you know how it will turn out
  • Making unauthorized copies of data, papers, or computer programs
  • Owning over $10,000 in stock in a company that sponsors your research and not disclosing this financial interest
  • Deliberately overestimating the clinical significance of a new drug in order to obtain economic benefits

These actions would be regarded as unethical by most scientists, and some may even be illegal. Most would also violate professional ethics codes or institutional policies. However, they do not fall into the narrow category of actions that the government classifies as research misconduct. Indeed, there has been considerable debate about the definition of "research misconduct," and many researchers and policy makers are not satisfied with the government's narrow definition, which focuses on FFP. However, given the huge list of potential offenses that might fall into the category of "other serious deviations," and the practical problems with defining and policing these other deviations, it is understandable why government officials have chosen to limit their focus.

Finally, situations frequently arise in research in which different people disagree about the proper course of action and there is no broad consensus about what should be done. In these situations, there may be good arguments on both sides of the issue and different ethical principles may conflict. These situations create difficult decisions for researchers, known as ethical or moral dilemmas. Consider the following case:

Dr. Wexford is the principal investigator of a large, epidemiological study on the health of 10,000 agricultural workers. She has an impressive dataset that includes information on demographics, environmental exposures, diet, genetics, and various disease outcomes such as cancer, Parkinson’s disease (PD), and ALS. She has just published a paper on the relationship between pesticide exposure and PD in a prestigious journal. She is planning to publish many other papers from her dataset. She receives a request from another research team that wants access to her complete dataset. They are interested in examining the relationship between pesticide exposures and skin cancer. Dr. Wexford was planning to conduct a study on this topic.

Dr. Wexford faces a difficult choice. On the one hand, the ethical norm of openness obliges her to share data with the other research team. Her funding agency may also have rules that obligate her to share data. On the other hand, if she shares data with the other team, they may publish results that she was planning to publish, thus depriving her (and her team) of recognition and priority. It seems that there are good arguments on both sides of this issue and Dr. Wexford needs to take some time to think about what she should do. One possible option is to share data, provided that the investigators sign a data use agreement. The agreement could define allowable uses of the data, publication plans, authorship, etc. Another option would be to offer to collaborate with the researchers.

The following are some steps that researchers, such as Dr. Wexford, can take to deal with ethical dilemmas in research:

What is the problem or issue?

It is always important to get a clear statement of the problem. In this case, the issue is whether to share information with the other research team.

What is the relevant information?

Many bad decisions are made as a result of poor information. To know what to do, Dr. Wexford needs to have more information concerning such matters as university or funding agency or journal policies that may apply to this situation, the team's intellectual property interests, the possibility of negotiating some kind of agreement with the other team, whether the other team also has some information it is willing to share, the impact of the potential publications, etc.

What are the different options?

People may fail to see different options due to a limited imagination, bias, ignorance, or fear. In this case, there may be other choices besides 'share' or 'don't share,' such as 'negotiate an agreement' or 'offer to collaborate with the researchers.'

How do ethical codes or policies as well as legal rules apply to these different options?

The university or funding agency may have policies on data management that apply to this case. Broader ethical rules, such as openness and respect for credit and intellectual property, may also apply to this case. Laws relating to intellectual property may be relevant.

Are there any people who can offer ethical advice?

It may be useful to seek advice from a colleague, a senior researcher, your department chair, an ethics or compliance officer, or anyone else you can trust. In this case, Dr. Wexford might want to talk to her supervisor and research team before making a decision.

After considering these questions, a person facing an ethical dilemma may decide to ask more questions, gather more information, explore different options, or consider other ethical rules. However, at some point he or she will have to make a decision and then take action. Ideally, a person who makes a decision in an ethical dilemma should be able to justify his or her decision to himself or herself, as well as colleagues, administrators, and other people who might be affected by the decision. He or she should be able to articulate reasons for his or her conduct and should consider the following questions in order to explain how he or she arrived at his or her decision:

  • Which choice will probably have the best overall consequences for science and society?
  • Which choice could stand up to further publicity and scrutiny?
  • Which choice could you not live with?
  • Think of the wisest person you know. What would he or she do in this situation?
  • Which choice would be the most just, fair, or responsible?

After considering all of these questions, one still might find it difficult to decide what to do. If this is the case, then it may be appropriate to consider other ways of making the decision, such as going with a gut feeling or intuition, seeking guidance through prayer or meditation, or even flipping a coin. Endorsing these methods in this context need not imply that ethical decisions are irrational, however. The main point is that human reasoning plays a pivotal role in ethical decision-making, but there are limits to its ability to solve all ethical dilemmas in a finite amount of time.

Promoting Ethical Conduct in Science


Related reading: "Do U.S. research institutions meet or exceed federal mandates for instruction in responsible conduct of research? A national survey" (NCBI PubMed).

Most academic institutions in the US require undergraduate, graduate, or postgraduate students to have some education in the responsible conduct of research (RCR). The NIH and NSF have both mandated training in research ethics for students and trainees. Many academic institutions outside of the US have also developed educational curricula in research ethics.

Those of you who are taking or have taken courses in research ethics may be wondering why you are required to have education in research ethics. You may believe that you are highly ethical and know the difference between right and wrong. You would never fabricate or falsify data or plagiarize. Indeed, you also may believe that most of your colleagues are highly ethical and that there is no ethics problem in research.

If you feel this way, relax. No one is accusing you of acting unethically. Indeed, the evidence produced so far shows that misconduct is a very rare occurrence in research, although there is considerable variation among various estimates. The rate of misconduct has been estimated to be as low as 0.01% of researchers per year (based on confirmed cases of misconduct in federally funded research) to as high as 1% of researchers per year (based on self-reports of misconduct on anonymous surveys). See Shamoo and Resnik (2015), cited above.

Clearly, it would be useful to have more data on this topic, but so far there is no evidence that science has become ethically corrupt, despite some highly publicized scandals. Even if misconduct is only a rare occurrence, it can still have a tremendous impact on science and society because it can compromise the integrity of research, erode the public’s trust in science, and waste time and resources. Will education in research ethics help reduce the rate of misconduct in science? It is too early to tell. The answer to this question depends, in part, on how one understands the causes of misconduct. There are two main theories about why researchers commit misconduct. According to the "bad apple" theory, most scientists are highly ethical. Only researchers who are morally corrupt, economically desperate, or psychologically disturbed commit misconduct. Moreover, only a fool would commit misconduct because science's peer review system and self-correcting mechanisms will eventually catch those who try to cheat the system. In any case, a course in research ethics will have little impact on "bad apples," one might argue.

According to the "stressful" or "imperfect" environment theory, misconduct occurs because various institutional pressures, incentives, and constraints encourage people to commit misconduct, such as pressures to publish or obtain grants or contracts, career ambitions, the pursuit of profit or fame, poor supervision of students and trainees, and poor oversight of researchers (see Shamoo and Resnik 2015). Moreover, defenders of the stressful environment theory point out that science's peer review system is far from perfect and that it is relatively easy to cheat the system. Erroneous or fraudulent research often enters the public record without being detected for years. Misconduct probably results from both environmental and individual causes, i.e., when people who are morally weak, ignorant, or insensitive are placed in stressful or imperfect environments.

In any case, a course in research ethics can be useful in helping to prevent deviations from norms even if it does not prevent misconduct. Education in research ethics can help people get a better understanding of ethical standards, policies, and issues and improve ethical judgment and decision making. Many of the deviations that occur in research may occur because researchers simply do not know or have never thought seriously about some of the ethical norms of research. For example, some unethical authorship practices probably reflect traditions and practices that have not been questioned seriously until recently. If the director of a lab is named as an author on every paper that comes from his lab, even if he does not make a significant contribution, what could be wrong with that? That's just the way it's done, one might argue. Another example where there may be some ignorance or mistaken traditions is conflicts of interest in research.
A researcher may think that a "normal" or "traditional" financial relationship, such as accepting stock or a consulting fee from a drug company that sponsors her research, raises no serious ethical issues. Or perhaps a university administrator sees no ethical problem in taking a large gift with strings attached from a pharmaceutical company. Maybe a physician thinks that it is perfectly appropriate to receive a $300 finder’s fee for referring patients into a clinical trial.

If "deviations" from ethical conduct occur in research as a result of ignorance or a failure to reflect critically on problematic traditions, then a course in research ethics may help reduce the rate of serious deviations by improving the researcher's understanding of ethics and by sensitizing him or her to the issues.

Finally, education in research ethics should be able to help researchers grapple with the ethical dilemmas they are likely to encounter by introducing them to important concepts, tools, principles, and methods that can be useful in resolving these dilemmas. Scientists must deal with a number of different controversial topics, such as human embryonic stem cell research, cloning, genetic engineering, and research involving animal or human subjects, which require ethical reflection and deliberation.


Ensuring ethical standards and procedures for research with human beings

Research ethics govern the standards of conduct for scientific researchers. It is important to adhere to ethical principles in order to protect the dignity, rights and welfare of research participants. As such, all research involving human beings should be reviewed by an ethics committee to ensure that the appropriate ethical standards are being upheld. Discussion of the ethical principles of beneficence, justice and autonomy are central to ethical review.

WHO works with Member States and partners to promote ethical standards and appropriate systems of review for any course of research involving human subjects. Within WHO, the Research Ethics Review Committee (ERC) ensures that WHO only supports research of the highest ethical standards. The ERC reviews all research projects involving human participants supported either financially or technically by WHO. The ERC is guided in its work by the World Medical Association Declaration of Helsinki (1964), last updated in 2013, as well as the CIOMS International Ethical Guidelines for Health-related Research Involving Humans (2016).


Related links

  • International Ethical Guidelines for Biomedical Research Involving Human Subjects, Council for International Organizations of Medical Sciences
  • International Ethical Guidelines for Epidemiological Studies, Council for International Organizations of Medical Sciences
  • World Medical Association: Declaration of Helsinki
  • European Group on Ethics
  • Directive 2001/20/EC of the European Parliament and of the Council
  • Council of Europe (Oviedo Convention - Protocol on biomedical research)
  • Nuffield Council: The Ethics of Research Related to Healthcare in Developing Countries

Annual Review of Ethics Case Studies

What Are Research Ethics Cases?

For additional information, please visit Resources for Research Ethics Education

Research Ethics Cases are a tool for discussing scientific integrity. Cases are designed to confront the readers with a specific problem that does not lend itself to easy answers. By providing a focus for discussion, cases help staff involved in research to define or refine their own standards, to appreciate alternative approaches to identifying and resolving ethical problems, and to develop skills for dealing with hard problems on their own.

Research Ethics Cases for Use by the NIH Community

  • Theme 24 – Using AI in Research and Ethical Conduct of Clinical Trials (2024)
  • Theme 23 – Authorship, Collaborations, and Mentoring (2023)
  • Theme 22 – Use of Human Biospecimens and Informed Consent (2022)
  • Theme 21 – Science Under Pressure (2021)
  • Theme 20 – Data, Project and Lab Management, and Communication (2020)
  • Theme 19 – Civility, Harassment and Inappropriate Conduct (2019)
  • Theme 18 – Implicit and Explicit Biases in the Research Setting (2018)
  • Theme 17 – Socially Responsible Science (2017)
  • Theme 16 – Research Reproducibility (2016)
  • Theme 15 – Authorship and Collaborative Science (2015)
  • Theme 14 – Differentiating Between Honest Discourse and Research Misconduct and Introduction to Enhancing Reproducibility (2014)
  • Theme 13 – Data Management, Whistleblowers, and Nepotism (2013)
  • Theme 12 – Mentoring (2012)
  • Theme 11 – Authorship (2011)
  • Theme 10 – Science and Social Responsibility, continued (2010)
  • Theme 9 – Science and Social Responsibility - Dual Use Research (2009)
  • Theme 8 – Borrowing - Is It Plagiarism? (2008)
  • Theme 7 – Data Management and Scientific Misconduct (2007)
  • Theme 6 – Ethical Ambiguities (2006)
  • Theme 5 – Data Management (2005)
  • Theme 4 – Collaborative Science (2004)
  • Theme 3 – Mentoring (2003)
  • Theme 2 – Authorship (2002)
  • Theme 1 – Scientific Misconduct (2001)

For Facilitators Leading Case Discussion

For the sake of time and clarity of purpose, it is essential that one individual have responsibility for leading the group discussion. At a minimum, this responsibility should include:

  • Reading the case aloud.
  • Defining, and re-defining as needed, the questions to be answered.
  • Encouraging discussion that is “on topic”.
  • Discouraging discussion that is “off topic”.
  • Keeping the pace of discussion appropriate to the time available.
  • Eliciting contributions from all members of the discussion group.
  • Summarizing both majority and minority opinions at the end of the discussion.

How Should Cases be Analyzed?

Many of the skills necessary to analyze case studies can become tools for responding to real world problems. Cases, like the real world, contain uncertainties and ambiguities. Readers are encouraged to identify key issues, make assumptions as needed, and articulate options for resolution. In addition to the specific questions accompanying each case, readers should consider the following questions:

  • Who are the affected parties (individuals, institutions, a field, society) in this situation?
  • What interest(s) (material, financial, ethical, other) does each party have in the situation? Which interests are in conflict?
  • Were the actions taken by each of the affected parties acceptable (ethical, legal, moral, or common sense)? If not, are there circumstances under which those actions would have been acceptable? Who should impose what sanction(s)?
  • What other courses of action are open to each of the affected parties? What is the likely outcome of each course of action?
  • For each party involved, what course of action would you take, and why?
  • What actions could have been taken to avoid the conflict?

Is There a Right Answer?

Acceptable Solutions

Most problems will have several acceptable solutions or answers, but it will not always be the case that a perfect solution can be found. At times, even the best solution will still have some unsatisfactory consequences.

Unacceptable Solutions

While more than one acceptable solution may be possible, not all solutions are acceptable. For example, obvious violations of specific rules and regulations or of generally accepted standards of conduct would typically be unacceptable. However, it is also plausible that blind adherence to accepted rules or standards would sometimes be an unacceptable course of action.

Ethical Decision-Making

It should be noted that ethical decision-making is a process rather than a specific correct answer. In this sense, unethical behavior is defined by a failure to engage in the process of ethical decision-making. It is always unacceptable to have made no reasonable attempt to define a consistent and defensible basis for conduct.

This page was last updated on Friday, July 26, 2024


This page has been archived and is no longer being updated regularly.

Cover Story

Five principles for research ethics

Cover your bases with these ethical strategies

By DEBORAH SMITH

Monitor Staff

January 2003, Vol 34, No. 1

Print version: page 56


Not that long ago, academicians were often cautious about airing the ethical dilemmas they faced in their research and academic work, but that environment is changing today. Psychologists in academe are more likely to seek out the advice of their colleagues on issues ranging from supervising graduate students to how to handle sensitive research data, says George Mason University psychologist June Tangney, PhD.

"There has been a real change in the last 10 years in people talking more frequently and more openly about ethical dilemmas of all sorts," she explains.

Indeed, researchers face an array of ethical requirements: They must meet professional, institutional and federal standards for conducting research with human participants, often supervise students they also teach and have to sort out authorship issues, just to name a few.

Here are five recommendations APA's Science Directorate gives to help researchers steer clear of ethical quandaries:

1. Discuss intellectual property frankly

Academe's competitive "publish-or-perish" mindset can be a recipe for trouble when it comes to who gets credit for authorship. The best way to avoid disagreements about who should get credit and in what order is to talk about these issues at the beginning of a working relationship, even though many people often feel uncomfortable about such topics.

"It's almost like talking about money," explains Tangney. "People don't want to appear to be greedy or presumptuous."

APA's Ethics Code offers some guidance: It specifies that "faculty advisors discuss publication credit with students as early as feasible and throughout the research and publication process as appropriate." When researchers and students put such understandings in writing, they have a helpful tool to continually discuss and evaluate contributions as the research progresses.

However, even the best plans can result in disputes, which often occur because people look at the same situation differently. "While authorship should reflect the contribution," says APA Ethics Office Director Stephen Behnke, JD, PhD, "we know from social science research that people often overvalue their contributions to a project. We frequently see that in authorship-type situations. In many instances, both parties genuinely believe they're right." APA's Ethics Code stipulates that psychologists take credit only for work they have actually performed or to which they have substantially contributed and that publication credit should accurately reflect the relative contributions: "Mere possession of an institutional position, such as department chair, does not justify authorship credit," says the code. "Minor contributions to the research or to the writing for publications are acknowledged appropriately, such as in footnotes or in an introductory statement."

The same rules apply to students. If they contribute substantively to the conceptualization, design, execution, analysis or interpretation of the research reported, they should be listed as authors. Contributions that are primarily technical don't warrant authorship. In the same vein, advisers should not expect ex-officio authorship on their students' work.

Matthew McGue, PhD, of the University of Minnesota, says his psychology department has instituted a procedure to avoid murky authorship issues. "We actually have a formal process here where students make proposals for anything they do on the project," he explains. The process allows students and faculty to more easily talk about research responsibility, distribution and authorship.

Psychologists should also be cognizant of situations where they have access to confidential ideas or research, such as reviewing journal manuscripts or research grants, or hearing new ideas during a presentation or informal conversation. While it's unlikely reviewers can purge all of the information in an interesting manuscript from their thinking, it's still unethical to take those ideas without giving credit to the originator.

"If you are a grant reviewer or a journal manuscript reviewer [who] sees someone's research [that] hasn't been published yet, you owe that person a duty of confidentiality and anonymity," says Gerald P. Koocher, PhD, editor of the journal Ethics and Behavior and co-author of "Ethics in Psychology: Professional Standards and Cases" (Oxford University Press, 1998).

Researchers also need to meet their ethical obligations once their research is published: If authors learn of errors that change the interpretation of research findings, they are ethically obligated to promptly correct the errors in a correction, retraction, erratum or by other means.

To be able to answer questions about study authenticity and allow others to reanalyze the results, authors should archive primary data and accompanying records for at least five years, advises the University of Minnesota's McGue. "Store all your data. Don't destroy it," he says. "Because if someone charges that you did something wrong, you can go back."

"It seems simple, but this can be a tricky area," says Susan Knapp, APA's deputy publisher. "The APA Publication Manual Section 8.05 has some general advice on what to retain and suggestions about things to consider in sharing data."

The APA Ethics Code requires psychologists to release their data to others who want to verify their conclusions, provided that participants' confidentiality can be protected and as long as legal rights concerning proprietary data don't preclude their release. However, the code also notes that psychologists who request data in these circumstances can only use the shared data for reanalysis; for any other use, they must obtain a prior written agreement.

2. Be conscious of multiple roles

APA's Ethics Code says psychologists should avoid relationships that could reasonably impair their professional performance or could exploit or harm others. But it also notes that many kinds of multiple relationships aren't unethical--as long as they're not reasonably expected to have adverse effects.

That notwithstanding, psychologists should think carefully before entering into multiple relationships with any person or group, such as recruiting students or clients as participants in research studies or investigating the effectiveness of a product of a company whose stock they own.

For example, when recruiting students from your Psychology 101 course to participate in an experiment, be sure to make clear that participation is voluntary. If participation is a course requirement, be sure to note that in the class syllabus, and ensure that participation has educative value by, for instance, providing a thorough debriefing to enhance students' understanding of the study. The 2002 Ethics Code also mandates in Standard 8.04b that students be given equitable alternatives to participating in research.

Perhaps one of the most common multiple roles for researchers is being both a mentor and lab supervisor to students they also teach in class. Psychologists need to be especially cautious that they don't abuse the power differential between themselves and students, say experts. They shouldn't, for example, use their clout as professors to coerce students into taking on additional research duties.

By outlining the nature and structure of the supervisory relationship before supervision or mentoring begins, both parties can avoid misunderstandings, says George Mason University's Tangney. It's helpful to create a written agreement that includes both parties' responsibilities as well as authorship considerations, intensity of the supervision and other key aspects of the job.

"While that's the ideal situation, in practice we do a lot less of that than we ought to," she notes. "Part of it is not having foresight up front of how a project or research study is going to unfold."

That's why experts also recommend that supervisors set up timely and specific methods to give students feedback and keep a record of the supervision, including meeting times, issues discussed and duties assigned.

If psychologists do find that they are in potentially harmful multiple relationships, they are ethically mandated to take steps to resolve them in the best interest of the person or group while complying with the Ethics Code.

3. Follow informed-consent rules

When done properly, the consent process ensures that individuals are voluntarily participating in the research with full knowledge of relevant risks and benefits.

"The federal standard is that the person must have all of the information that might reasonably influence their willingness to participate in a form that they can understand and comprehend," says Koocher, dean of Simmons College's School for Health Studies.

APA's Ethics Code mandates that psychologists who conduct research should inform participants about:

  • The purpose of the research, expected duration and procedures.
  • Participants' rights to decline to participate and to withdraw from the research once it has started, as well as the anticipated consequences of doing so.
  • Reasonably foreseeable factors that may influence their willingness to participate, such as potential risks, discomfort or adverse effects.
  • Any prospective research benefits.
  • Limits of confidentiality, such as data coding, disposal, sharing and archiving, and when confidentiality must be broken.
  • Incentives for participation.
  • Who participants can contact with questions.

Experts also suggest covering the likelihood, magnitude and duration of harm or benefit of participation, emphasizing that their involvement is voluntary and discussing treatment alternatives, if relevant to the research.

Keep in mind that the Ethics Code includes specific mandates for researchers who conduct experimental treatment research. Specifically, they must inform individuals about the experimental nature of the treatment, services that will or will not be available to the control groups, how participants will be assigned to treatments and control groups, available treatment alternatives and compensation or monetary costs of participation.

If research participants or clients are not competent to evaluate the risks and benefits of participation themselves--for example, minors or people with cognitive disabilities--then the person who's giving permission must have access to that same information, says Koocher.

Remember that a signed consent form doesn't mean the informing process can be glossed over, say ethics experts. In fact, the APA Ethics Code says psychologists can skip informed consent in two instances only: When permitted by law or federal or institutional regulations, or when the research would not reasonably be expected to distress or harm participants and involves one of the following:

The study of normal educational practices, curricula or classroom management methods conducted in educational settings.

Anonymous questionnaires, naturalistic observations or archival research for which disclosure of responses would not place participants at risk of criminal or civil liability or damage their financial standing, employability or reputation, and for which confidentiality is protected.

The study of factors related to job or organization effectiveness conducted in organizational settings for which there is no risk to participants' employability, and confidentiality is protected.

If psychologists are precluded from obtaining full consent at the beginning--for example, if the protocol includes deception, recording spontaneous behavior or the use of a confederate--they should be sure to offer a full debriefing after data collection and provide people with an opportunity to reiterate their consent, advise experts.

The code also says psychologists should make reasonable efforts to avoid offering "excessive or inappropriate financial or other inducements for research participation when such inducements are likely to coerce participation."

4. Respect confidentiality and privacy

Upholding individuals' rights to confidentiality and privacy is a central tenet of every psychologist's work. However, many privacy issues are idiosyncratic to the research population, writes Susan Folkman, PhD, in "Ethics in Research with Human Participants" (APA, 2000). For instance, researchers need to devise ways to ask whether participants are willing to talk about sensitive topics without putting them in awkward situations, say experts. That could mean they provide a set of increasingly detailed interview questions so that participants can stop if they feel uncomfortable.

And because research participants have the freedom to choose how much information about themselves they will reveal and under what circumstances, psychologists should be careful when recruiting participants for a study, says Sangeeta Panicker, PhD, director of the APA Science Directorate's Research Ethics Office. For example, it's inappropriate to obtain contact information of members of a support group to solicit their participation in research. However, you could give your colleague who facilitates the group a letter to distribute that explains your research study and provides a way for individuals to contact you, if they're interested.

Other steps researchers should take include:

Discuss the limits of confidentiality. Give participants information about how their data will be used, what will be done with case materials, photos and audio and video recordings, and secure their consent.

Know federal and state law. Know the ins and outs of state and federal law that might apply to your research. For instance, the Goals 2000: Education Act of 1994 prohibits asking children about religion, sex or family life without parental permission.

Another example is that, while most states only require licensed psychologists to comply with mandatory reporting laws, some laws also require researchers to report abuse and neglect. That's why it's important for researchers to plan for situations in which they may learn of such reportable offenses. Generally, research psychologists can consult with a clinician or their institution's legal department to decide the best course of action.

Take practical security measures. Be sure confidential records are stored in a secure area with limited access, and consider stripping them of identifying information, if feasible. Also, be aware of situations where confidentiality could inadvertently be breached, such as having confidential conversations in a room that's not soundproof or putting participants' names on bills paid by accounting departments.

Think about data sharing before research begins. If researchers plan to share their data with others, they should note that in the consent process, specifying how they will be shared and whether data will be anonymous. For example, researchers could have difficulty sharing sensitive data they've collected in a study of adults with serious mental illnesses because they failed to ask participants for permission to share the data. Or developmental data collected on videotape may be a valuable resource for sharing, but unless a researcher asked permission back then to share videotapes, it would be unethical to do so. When sharing, psychologists should use established techniques when possible to protect confidentiality, such as coding data to hide identities. "But be aware that it may be almost impossible to entirely cloak identity, especially if your data include video or audio recordings or can be linked to larger databases," says Merry Bullock, PhD, associate executive director in APA's Science Directorate.
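Bullock's point about coding data to hide identities can be illustrated with a short sketch. This is a minimal illustration, not an APA-endorsed procedure; the record layout, the field names, and the `pseudonymize` helper are all assumptions made for the example.

```python
import secrets

def pseudonymize(records, id_field="name"):
    """Replace direct identifiers with random codes, keeping the
    code-to-identity key separate from the shareable data set."""
    key = {}      # identity -> code; store securely, apart from the data
    coded = []
    for rec in records:
        identity = rec[id_field]
        if identity not in key:
            key[identity] = secrets.token_hex(4)  # random 8-char code
        shared = dict(rec)                        # copy; originals untouched
        shared[id_field] = key[identity]
        coded.append(shared)
    return coded, key

participants = [{"name": "Jane Doe", "score": 42}]
shared, key = pseudonymize(participants)
# 'shared' can be archived or shared; 'key' stays with the research team
```

As Bullock cautions in the text, coding like this may still be insufficient when the data can be linked to recordings or larger databases, so it complements rather than replaces the consent-stage planning described above.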

Understand the limits of the Internet. Since Web technology is constantly evolving, psychologists need to be technologically savvy to conduct research online and cautious when exchanging confidential information electronically. If you're not an Internet whiz, get the help of someone who is. Otherwise, it may be possible for others to tap into data that you thought was properly protected.

5. Tap into ethics resources

One of the best ways researchers can avoid and resolve ethical dilemmas is to know both what their ethical obligations are and what resources are available to them.

"Researchers can help themselves make ethical issues salient by reminding themselves of the basic underpinnings of research and professional ethics," says Bullock. Those basics include:

The Belmont Report. Released by the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research in 1979, the report provided the ethical framework for ensuing human participant research regulations and still serves as the basis for human participant protection legislation (see Further Reading).

APA's Ethics Code, which offers general principles and specific guidance for research activities.

Moreover, despite the sometimes tense relationship researchers can have with their institutional review boards (IRBs), these groups can often help researchers think about how to address potential dilemmas before projects begin, says Panicker. But psychologists must first give their IRBs the information they need to properly understand a research proposal.

"Be sure to provide the IRB with detailed and comprehensive information about the study, such as the consent process, how participants will be recruited and how confidential information will be protected," says Bullock. "The more information you give your IRB, the better educated its members will become about behavioral research, and the easier it will be for them to facilitate your research."

As cliché as it may be, says Panicker, thinking positively about your interactions with an IRB can help smooth the process for both researchers and the IRBs reviewing their work.

Further reading

American Psychological Association. (2002). Ethical principles of psychologists and code of conduct. American Psychologist, 57 (12).

Sales, B.D., & Folkman, S. (Eds.). (2000). Ethics in research with human participants . Washington, DC: American Psychological Association.

APA's Research Ethics Office in the Science Directorate; Web site: APA Science.

The National Institutes of Health (NIH) offers educational materials on human subjects .

NIH Bioethics Resources Web site .

The Department of Health and Human Services' (DHHS) Office of Research Integrity Web site .

DHHS Office of Human Research Protections Web site .

The 1979 Belmont Report on protecting human subjects .

Association for the Accreditation of Human Research Protection Programs Web site: www.aahrpp.org .


Ethical Considerations In Psychology Research

Saul McLeod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul McLeod, PhD., is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.


Ethics refers to the correct rules of conduct necessary when carrying out research. We have a moral responsibility to protect research participants from harm.

However important the issue under investigation, psychologists must remember that they have a duty to respect the rights and dignity of research participants. This means that they must abide by certain moral principles and rules of conduct.

What are Ethical Guidelines?

In Britain, ethical guidelines for research are published by the British Psychological Society, and in America, by the American Psychological Association. The purpose of these codes of conduct is to protect research participants, the reputation of psychology, and psychologists themselves.

Moral issues rarely yield a simple, unambiguous, right or wrong answer. It is, therefore, often a matter of judgment whether the research is justified or not.

For example, it might be that a study causes psychological or physical discomfort to participants; maybe they suffer pain or perhaps even come to serious harm.

On the other hand, the investigation could lead to discoveries that benefit the participants themselves or even have the potential to increase the sum of human happiness.

Rosenthal and Rosnow (1984) also discuss the potential costs of failing to carry out certain research. Who is to weigh up these costs and benefits? Who is to judge whether the ends justify the means?

Finally, if you are ever in doubt as to whether research is ethical or not, it is worthwhile remembering that if there is a conflict of interest between the participants and the researcher, it is the interests of the subjects that should take priority.

Studies must now undergo an extensive review by an institutional review board (US) or ethics committee (UK) before they are implemented. All UK research requires ethical approval by one or more of the following:

  • Department Ethics Committee (DEC) : for most routine research.
  • Institutional Ethics Committee (IEC) : for non-routine research.
  • External Ethics Committee (EEC) : for research that is externally regulated (e.g., NHS research).

Committees review proposals to assess if the potential benefits of the research are justifiable in light of the possible risk of physical or psychological harm.

These committees may request researchers make changes to the study’s design or procedure or, in extreme cases, deny approval of the study altogether.

The British Psychological Society (BPS) and American Psychological Association (APA) have issued a code of ethics in psychology that provides guidelines for conducting research.  Some of the more important ethical issues are as follows:

Informed Consent

Before the study begins, the researcher must outline to the participants what the research is about and then ask for their consent (i.e., permission) to participate.

An adult (18 years+) with the capacity to consent can agree to participate on their own behalf. Parents or legal guardians of minors can also provide consent to allow their children to participate in a study.

Whenever possible, investigators should obtain the consent of participants. In practice, this means it is not sufficient to get potential participants to say “Yes.”

They also need to know what it is that they agree to. In other words, the psychologist should, so far as is practicable, explain what is involved in advance and obtain the informed consent of participants.

Informed consent must be informed, voluntary, and rational. Participants must be given relevant details to make an informed decision, including the purpose, procedures, risks, and benefits. Consent must be given voluntarily without undue coercion. And participants must have the capacity to rationally weigh the decision.

Components of informed consent include clearly explaining the risks and expected benefits, addressing potential therapeutic misconceptions about experimental treatments, allowing participants to ask questions, and describing methods to minimize risks like emotional distress.

Investigators should tailor the consent language and process appropriately for the study population. Obtaining meaningful informed consent is an ethical imperative for human subjects research.

The voluntary nature of participation should not be compromised through coercion or undue influence. Inducements should be fair and not excessive/inappropriate.

However, it is not always possible to gain informed consent.  Where the researcher can’t ask the actual participants, a similar group of people can be asked how they would feel about participating.

If they think it would be OK, then it can be assumed that the real participants will also find it acceptable. This is known as presumptive consent.

However, a problem with this method is that there might be a mismatch between how people think they would feel/behave and how they actually feel and behave during a study.

In order for consent to be ‘informed,’ consent forms may need to be accompanied by an information sheet for participants setting out information about the proposed study (in lay terms), along with details about the investigators and how they can be contacted.

Special considerations exist when obtaining consent from vulnerable populations with decisional impairments, such as psychiatric patients, intellectually disabled persons, and children/adolescents. Capacity can vary widely so should be assessed individually, but interventions to improve comprehension may help. Legally authorized representatives usually must provide consent for children.

Participants must be given information relating to the following:

  • A statement that participation is voluntary and that refusal to participate will not result in any consequences or any loss of benefits that the person is otherwise entitled to receive.
  • Purpose of the research.
  • All foreseeable risks and discomforts to the participant (if there are any). These include not only physical injury but also possible psychological harm.
  • Procedures involved in the research.
  • Benefits of the research to society and possibly to the individual human subject.
  • Length of time the subject is expected to participate.
  • Person to contact for answers to questions or in the event of injury or emergency.
  • Subjects’ right to confidentiality and the right to withdraw from the study at any time without any consequences.

Debriefing

Debriefing after a study involves informing participants about the purpose, providing an opportunity to ask questions, and addressing any harm from participation. Debriefing serves an educational function and allows researchers to correct misconceptions. It is an ethical imperative.

After the research is over, the participant should be able to discuss the procedure and the findings with the psychologist. They must be given a general idea of what the researcher was investigating and why, and their part in the research should be explained.

Participants must be told if they have been deceived and given reasons why. They must be asked if they have any questions, which should be answered honestly and as fully as possible.

Debriefing should occur as soon as possible and be as full as possible; experimenters should take reasonable steps to ensure that participants understand debriefing.

“The purpose of debriefing is to remove any misconceptions and anxieties that the participants have about the research and to leave them with a sense of dignity, knowledge, and a perception of time not wasted” (Harris, 1988).

The debriefing aims to provide information and help the participant leave the experimental situation in a similar frame of mind as when he/she entered it (Aronson, 1988).

Exceptions may exist if debriefing seriously compromises study validity or causes harm itself, like negative emotions in children. Consultation with an institutional review board guides exceptions.

Debriefing indicates investigators’ commitment to participant welfare. Harms may not be raised in the debriefing itself, so responsibility continues after data collection. Following up demonstrates respect and protects persons in human subjects research.

Protection of Participants

Researchers must ensure that those participating in research will not be caused distress. They must be protected from physical and mental harm. This means you must not embarrass, frighten, offend or harm participants.

Normally, the risk of harm must be no greater than in ordinary life, i.e., participants should not be exposed to risks greater than or additional to those encountered in their normal lifestyles.

The researcher must also ensure that if vulnerable groups are to be used (elderly, disabled, children, etc.), they must receive special care. For example, if studying children, ensure their participation is brief as they get tired easily and have a limited attention span.

Researchers are not always accurately able to predict the risks of taking part in a study, and in some cases, a therapeutic debriefing may be necessary if participants have become disturbed during the research (as happened to some participants in Zimbardo’s prisoners/guards study ).

Deception research involves purposely misleading participants or withholding information that could influence their participation decision. This method is controversial because it limits informed consent and autonomy, but can provide otherwise unobtainable valuable knowledge.

Types of deception include (i) deliberate misleading, e.g. using confederates, staged manipulations in field settings, deceptive instructions; (ii) deception by omission, e.g., failure to disclose full information about the study, or creating ambiguity.

The researcher should avoid deceiving participants about the nature of the research unless there is no alternative – and even then, this would need to be judged acceptable by an independent expert. However, some types of research cannot be carried out without at least some element of deception.

For example, in Milgram’s study of obedience , the participants thought they were giving electric shocks to a learner when they answered a question wrongly. In reality, no shocks were given, and the learners were confederates of Milgram.

This is sometimes necessary to avoid demand characteristics (i.e., the clues in an experiment that lead participants to think they know what the researcher is looking for).

Another common example is when a stooge or confederate of the experimenter is used (this was the case in both the experiments carried out by Asch ).

According to ethics codes, deception must have strong scientific justification, and non-deceptive alternatives should not be feasible. Deception that causes significant harm is prohibited. Investigators should carefully weigh whether deception is necessary and ethical for their research.

However, participants must be deceived as little as possible, and any deception must not cause distress.  Researchers can gauge whether participants are likely to be distressed when deception is disclosed by consulting culturally relevant groups.

Participants should immediately be informed of the deception without compromising the study’s integrity. Reactions to learning of deception can range from understanding to anger. Debriefing should explain the scientific rationale and social benefits to minimize negative reactions.

If the participant is likely to object or be distressed once they discover the true nature of the research at debriefing, then the study is unacceptable.

If you have gained participants’ informed consent by deception, then they will have agreed to take part without actually knowing what they were consenting to.  The true nature of the research should be revealed at the earliest possible opportunity or at least during debriefing.

Some researchers argue that deception can never be justified and object to this practice as it (i) violates an individual’s right to choose to participate; (ii) is a questionable basis on which to build a discipline; and (iii) leads to distrust of psychology in the community.

Confidentiality

Protecting participant confidentiality is an ethical imperative that demonstrates respect, ensures honest participation, and prevents harms like embarrassment or legal issues. Methods like data encryption, coding systems, and secure storage should match the research methodology.

Participants and the data gained from them must be kept anonymous unless they give their full consent.  No names must be used in a lab report .

Researchers must clearly describe to participants the limits of confidentiality and methods to protect privacy. With internet research, threats exist like third-party data access; security measures like encryption should be explained. For non-internet research, other protections should be noted too, like coding systems and restricted data access.

High-profile data breaches have eroded public trust. Methods that minimize identifiable information can further guard confidentiality. For example, researchers can consider whether birthdates are necessary or just ages.

Generally, reducing personal details collected and limiting accessibility safeguards participants. Following strong confidentiality protections demonstrates respect for persons in human subjects research.
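The birthdate-versus-age point above lends itself to a small sketch of data minimization at collection time. The record layout and the `age_only` helper are illustrative assumptions, not a prescribed method.

```python
from datetime import date

def age_only(record, today=None):
    """Data minimization: replace a birthdate with an age at collection,
    which is often all the analysis needs (field names are illustrative)."""
    today = today or date.today()
    dob = record.pop("birthdate")          # drop the identifying field
    # Subtract one year if the birthday hasn't occurred yet this year
    not_yet = (today.month, today.day) < (dob.month, dob.day)
    record["age"] = today.year - dob.year - not_yet
    return record

rec = age_only({"participant": "P-07", "birthdate": date(1990, 6, 15)},
               today=date(2024, 5, 1))
# rec now holds {"participant": "P-07", "age": 33}; the birthdate is gone
```

Dropping the identifying field at the point of collection, rather than filtering it out later, means a full birthdate never enters the stored data set at all.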

What do we do if we discover something that should be disclosed (e.g., a criminal act)? Researchers generally have no legal obligation to disclose criminal acts (though mandatory reporting laws may apply in some jurisdictions) and must determine the most important consideration: their duty to the participant vs. their duty to the wider community.

Ultimately, decisions to disclose information must be set in the context of the research aims.

Withdrawal from an Investigation

Participants should be able to leave a study anytime if they feel uncomfortable. They should also be allowed to withdraw their data. They should be told at the start of the study that they have the right to withdraw.

They should not have pressure placed upon them to continue if they do not want to (a guideline flouted in Milgram’s research).

Participants may feel they shouldn’t withdraw as this may ‘spoil’ the study. Many participants are paid or receive course credits; they may worry they won’t get this if they withdraw.

Even at the end of the study, the participant has a final opportunity to withdraw the data they have provided for the research.

Ethical Issues in Psychology & Socially Sensitive Research

Many psychologists have long assumed that, provided they follow the BPS or APA guidelines when using human participants (everyone leaves in a similar state of mind to how they arrived, no one is deceived or humiliated, all are debriefed, and no confidentiality is breached), their research raises no ethical concerns.

But consider the following examples:

a) Caughy et al. (1994) found that middle-class children in daycare at an early age generally score lower on cognitive tests than children from similar families reared in the home.

Assuming all guidelines were followed, neither the parents nor the children participating would have been unduly affected by this research. Nobody would have been deceived, consent would have been obtained, and no harm would have been caused.

However, consider the wider implications of this study when the results are published, particularly for parents of middle-class infants who are considering placing their young children in daycare or those who recently have!

b)  IQ tests administered to black Americans show that they typically score 15 points below the average white score.

When black Americans are given these tests, they presumably complete them willingly and are not harmed as individuals. However, when published, findings of this sort seek to reinforce racial stereotypes and are used to discriminate against the black population in the job market, etc.

Sieber and Stanley (1988), the main names in socially sensitive research (SSR), outline four groups that may be affected by psychological research. It is the first of these groups that we are most concerned with:
  • Members of the social group being studied, such as racial or ethnic group. For example, early research on IQ was used to discriminate against US Blacks.
  • Friends and relatives of those participating in the study, particularly in case studies, where individuals may become famous or infamous. Cases that spring to mind would include Genie’s mother.
  • The research team. There are examples of researchers being intimidated because of the line of research they are in.
  • The institution in which the research is conducted.
Sieber and Stanley also suggest there are four main ethical concerns when conducting SSR:
  • The research question or hypothesis.
  • The treatment of individual participants.
  • The institutional context.
  • How the findings of the research are interpreted and applied.

Ethical Guidelines For Carrying Out SSR

Sieber and Stanley suggest the following ethical guidelines for carrying out SSR. There is some overlap between these and research on human participants in general.

Privacy : This refers to people rather than data. Asking people questions of a personal nature (e.g., about sexuality) could offend.

Confidentiality: This refers to data. Information (e.g., about H.I.V. status) leaked to others may affect the participant’s life.

Sound & valid methodology : This is even more vital when the research topic is socially sensitive. Academics can detect flaws in methods, but the lay public and the media often don’t.

When research findings are publicized, people are likely to consider them fact, and policies may be based on them. Examples are Bowlby’s maternal deprivation studies and intelligence testing.

Deception : Causing the wider public to believe something that isn’t true through the findings you report (e.g., that parents are responsible for how their children turn out).

Informed consent : Participants should be made aware of how participating in the research may affect them.

Justice & equitable treatment : Examples of unjust treatment are (i) publicizing an idea that creates prejudice against a group, and (ii) withholding a treatment you believe is beneficial from some participants so that you can use them as controls.

Scientific freedom : Science should not be censored, but there should be some monitoring of sensitive research. The researcher should weigh their responsibilities against their rights to do the research.

Ownership of data : When research findings could be used to make social policies that affect people’s lives, should they be publicly accessible? Sometimes, a party commissions research with their interests in mind (e.g., an industry, an advertising agency, a political party, or the military).

Some people argue that scientists should be compelled to disclose their results so that other scientists can re-analyze them. If this had happened in Burt’s day, there might not have been such widespread belief in the genetic transmission of intelligence. George Miller (Miller’s Magic 7) famously argued that we should give psychology away.

The values of social scientists : Psychologists can be divided into two main groups: those who advocate a humanistic approach (individuals are important and worthy of study, quality of life is important, intuition is useful) and those advocating a scientific approach (rigorous methodology, objective data).

The researcher’s values may conflict with those of the participant/institution. For example, if someone with a scientific approach was evaluating a counseling technique based on a humanistic approach, they would judge it on criteria that those giving & receiving the therapy may not consider important.

Cost/benefit analysis : It is unethical if the costs outweigh the potential/actual benefits. However, it isn’t easy to assess costs and benefits accurately, and the participants themselves rarely benefit from research.

Sieber & Stanley advise that researchers should not avoid researching socially sensitive issues. Scientists have a responsibility to society to find useful knowledge.

  • They need to take more care over consent, debriefing, etc. when the issue is sensitive.
  • They should be aware of how their findings may be interpreted & used by others.
  • They should make explicit the assumptions underlying their research so that the public can consider whether they agree with these.
  • They should make the limitations of their research explicit (e.g., ‘the study was only carried out on white middle-class American male students,’ ‘the study is based on questionnaire data, which may be inaccurate,’ etc.).
  • They should be careful how they communicate with the media and policymakers.
  • They should be aware of the balance between their obligations to participants and those to society (e.g. if the participant tells them something which they feel they should tell the police/social services).
  • They should be aware of their own values and biases and those of the participants.

Arguments for SSR

  • Psychologists have devised methods to resolve the issues raised.
  • SSR is the most scrutinized research in psychology. Ethical committees reject more SSR than any other form of research.
  • By gaining a better understanding of issues such as gender, race, and sexuality, we are able to gain greater acceptance and reduce prejudice.
  • SSR has been of benefit to society, for example, research on eyewitness testimony (EWT). This has made us aware that EWT can be flawed and should not be used without corroboration. It has also made us aware that the EWT of children is every bit as reliable as that of adults.
  • Most research is still carried out on white middle-class Americans (about 90% of the research quoted in texts!). SSR is helping to redress the balance and make us more aware of other cultures and outlooks.

Arguments against SSR

  • Flawed research has been used to dictate social policy and put certain groups at a disadvantage.
  • Research has been used to discriminate against groups in society, such as the sterilization of people in the USA between 1910 and 1920 because they were deemed to be of low intelligence, criminal, or mentally ill.
  • The guidelines used by psychologists to control SSR lack power and, as a result, are unable to prevent indefensible research from being carried out.

American Psychological Association. (2002). Ethical principles of psychologists and code of conduct. www.apa.org/ethics/code2002.html

Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram's "Behavioral study of obedience." American Psychologist, 19(6), 421.

Caughy, M. O. B., DiPietro, J. A., & Strobino, D. M. (1994). Day-care participation as a protective factor in the cognitive development of low-income children. Child Development, 65(2), 457–471.

Harris, B. (1988). Key words: A history of debriefing in social psychology. In J. Morawski (Ed.), The rise of experimentation in American psychology (pp. 188–212). New York: Oxford University Press.

Rosenthal, R., & Rosnow, R. L. (1984). Applying Hamlet's question to the ethical conduct of research: A conceptual addendum. American Psychologist, 39(5), 561.

Sieber, J. E., & Stanley, B. (1988). Ethical and professional dimensions of socially sensitive research. American Psychologist, 43(1), 49.

The British Psychological Society. (2010). Code of Human Research Ethics. www.bps.org.uk/sites/default/files/documents/code_of_human_research_ethics.pdf

Further Information

  • MIT Psychology Ethics Lecture Slides

BPS Documents

  • Code of Ethics and Conduct (2018)
  • Good Practice Guidelines for the Conduct of Psychological Research within the NHS
  • Guidelines for Psychologists Working with Animals
  • Guidelines for ethical practice in psychological research online

APA Documents

APA Ethical Principles of Psychologists and Code of Conduct

Pig hearts in people: Xenotransplantation's long history, current promise, and the ethical use of brain-dead people in research

In the early hours of January 7, 2022, David Bennett was out of options. At just 57 years old, he was bedridden, on life support, and in desperate need of a heart transplant for which he was ineligible. Yet Bennett would go on to live for two more months — not with a human heart, but with a heart from a pig. David Bennett was the first case of a pig heart being transplanted into a human, an example of xenotransplantation — when the cells, tissues or organs from one species are transplanted into another. In the United States, over 100,000 kids and adults are currently on the national transplant waiting list, and every day around 17 people on that list die while waiting. 

In today's episode, we cover the science that made Bennett’s transplant possible, and what doctors learned from him that helped the next heart xenotransplant recipient, Lawrence Faucette, live even longer. We also get into some of the ethics conversations surrounding xenotransplantation work — not just questions about the use of animals like pigs and baboons, but experiments with recently deceased, i.e. brain dead, people.

Transcript of this Episode

Sam Jones: In the early hours of January 7, 2022, David Bennett was out of options. At just 57 years old, he was bedridden, on life support, and in desperate need of a heart transplant for which he was ineligible. Yet Bennett would go on to live for two more months — not with a human heart, but with a heart from a pig. 

Welcome to Tiny Matters, I’m Sam Jones and today I’m joined by my co-host, science communicator and producer, George Zaidan. George, welcome to Tiny Matters!

George Zaidan: Thanks so much for having me! I’m especially glad to be here because this is such a fascinating story. David Bennett was the first case of a pig heart being transplanted into a human, an example of xenotransplantation — when the cells, tissues or organs from one species are transplanted into another. In the United States, over 100,000 kids and adults are currently on the national transplant waiting list, and every day around 17 people on that list die while waiting. 

Sam: So today on the show we’re going to talk about the science and history that made Bennett’s transplant possible, and what doctors learned from him that helped the next heart xenotransplant recipient, Lawrence Faucette, live even longer. We’ll also get into some of the ethics conversations surrounding xenotransplantation work — not just questions about the use of animals like pigs and baboons, but experiments with recently deceased people.

Throughout history, you’ll find stories of human-beast hybrids, for instance the ancient Egyptian god Anubis — body of a man with the head of a dog — or the sphinx in Ancient Greece, who was part woman, part bird, part lion. Starting centuries ago, there are records of taking the blood and skin from other animals and using them in humans.

Muhammad Mohiuddin: Even in the early 1600s, they tried to put a dog skull in a nobleman just to repair a defect there. However, the church disallowed it and they took it out and the nobleman died.

George: That’s Muhammad Mohiuddin, a professor of surgery and the Director of the Cardiac Xenotransplantation Program at the University of Maryland School of Medicine. He co-led the team that performed both David Bennett and Lawrence Faucette’s surgeries. 

He told us that in the early 1900s, there was a rise in xenotransplantation attempts using organs from a number of species, but the survival rate was low — like hours, not even days. That’s not surprising given what we now know about the immune system, and the role it plays in an organ being rejected by a transplant recipient’s body after what may have looked like a successful surgery. 

Muhammad has dedicated his life to understanding the immune response to xenotransplantation and how to temper it. And we’re going to get into that a bit later in this episode. But first let’s talk about a few milestones in the xenotransplantation world. Because immunosuppressant drugs weren’t available in the early 1900s, doctors were at a loss, and by the mid 1920s many had moved away from xenotransplantation. And it stayed that way for decades. 

Sam: A big turning point came in 1963, when surgeon Keith Reemtsma at Tulane University transplanted the kidneys of chimpanzees and, in one case, a rhesus monkey, into people. One of the chimpanzee kidney recipients survived an astonishing 9 months. That huge jump in survival time was attributed to new immunosuppressive drugs that kept the immune system from immediately flaring up and rejecting the organ. Unfortunately, the other xenotransplant recipients died within a couple of months, either due to immune rejection or because their immune system was so depleted that they developed an infection they would typically have been able to fight off. 

George: A year later, in 1964, a doctor tried the first cardiac xenotransplantation in humans, using a chimpanzee heart. Unfortunately, the patient died within a couple of hours. But twenty years later, Baby Fae really put cardiac xenotransplantation on the map. In 1984, a surgeon named Leonard Bailey at Loma Linda University Medical Center in California transplanted a baboon heart into 12-day-old Stephanie Fae Beauclair, better known as Baby Fae.

Baby Fae was born with hypoplastic left heart syndrome. This is a condition where the left side of the heart is so underdeveloped that it has trouble pumping blood. In Baby Fae’s case it was so severe that she wouldn’t have survived. As her heart failed, Bailey performed the surgery. Just hours later, her new baboon heart began beating. Sadly, just a few weeks after that, Baby Fae’s immune system rejected the heart and she passed away. But in the years since then, researchers have made huge strides in immunosuppression medications and treatment approaches, and have also moved away from using organs from non-human primates. 

Muhammad Mohiuddin: We found out that chimpanzees or baboons, though very similar to humans, there are several disadvantages of transplanting their organs. 

Sam: One disadvantage is that they carry diseases that can easily transmit to us, including simian immunodeficiency viruses or SIV, notorious for crossing into humans and causing HIV. Muhammad told us the focus soon began to shift to pigs.

Muhammad Mohiuddin: They are domestic animals raised well in captivity, they grow very fast. So for a human of about 80 kilograms, you need a pig of only one year of age, and that heart or other organs will be compatible with human organ size. 

Sam: Researchers also know a lot about the pig genome, which means we can tinker with their genetics. 

Muhammad Mohiuddin: So with the technology now we have to modify genes, we can alter their genes and take out the genes that are immunogenic to humans and then put in some human genes to make them more compatible to human.

Sam: This is a huge deal, because it means pig heart xenotransplant recipients are no longer fully reliant on immunosuppressants. Researchers are able to remove the genes in pigs that code for the production of molecules that trigger the human immune system. At the same time, researchers can now insert genes that make a pig’s heart appear to be more human to our immune system. 

George: This is possible thanks to cloning. The company Muhammad and his colleagues work with is called Revivicor, which is a spinoff of PPL Therapeutics, which is the company that cloned Dolly the sheep. 

Revivicor retrieves eggs from the ovaries of female pigs, removes the DNA and replaces it with new DNA. And in that new DNA, the researchers removed three genes that are responsible for rejection of pig organs by human immune systems. The gene for a growth hormone receptor was also removed to prevent the pig heart from growing too much once transplanted. 

Sam: Six human genes were also inserted into the new DNA, which would help with immune acceptance. They then placed the eggs, now fertilized, back in the pig’s uterus where they developed into embryos. There can still be some variation in genes even with these genetic modifications, but the team is ultimately working to create a stable breeding line where pigs who show stable genetics would then be bred with each other so you’d always get these 10 desired genetic mutations in their offspring. You’d no longer need cloning.

So by 2021, Muhammad and his colleagues had not only genetically modified pigs but were using better immunosuppressants, including a new anti-CD40 antibody, which targets an incredibly important immune pathway in humans.  

Muhammad Mohiuddin: So at that time, we thought that this is the right time to take this to humans and save millions of people throughout the world who cannot get a human heart either because of the shortage or because of certain conditions these patients have, which make them ineligible for a human heart.

George: So they approached the FDA. And although xenotransplantation surgery is not yet approved, it falls under “compassionate use” rules for emergency situations, similar to how new cancer drugs that are not FDA approved can be used in a terminal patient, with their consent of course. And that brings us back to David Bennett. Muhammad told us that, after going back and forth several times with the FDA, they were granted compassionate use approval.

Muhammad Mohiuddin: And then we presented this idea to Mr. Bennett and he graciously accepted it saying that even if it doesn't help me, it may help other people. So he volunteered his life for this purpose.

George: On January 7th, 2022, cardiothoracic surgeon Bartley Griffith, alongside Muhammad and the rest of the team at the University of Maryland Medical Center, performed the first successful xenotransplantation surgery placing a genetically modified pig heart into a person.

Muhammad Mohiuddin: When we were doing this transplant, nobody knew what to expect because there was no precedence. So we even told the patient, and the patient understood that there is not even a guarantee that he will recover from this transplant. Even wake up. So every day from that point on, we took it as success. And finally he lived for 60 days. 

George: Now for context, the first person to receive a heart transplant from another human only lived for 18 days. Over the course of David Bennett’s 60 days, before his immune system ultimately rejected the pig heart, he not only got more time with his family — Muhammad and the team were also learning from him.

Muhammad Mohiuddin: At one point we had to stop one of the major immunosuppressive drugs for a little bit. We did not know what levels of the CD40 that we used very successfully in baboons is enough for this particular patient, because of course he’s not a baboon. And also there were so many other issues going on simultaneously that we had difficulty maintaining the levels of that drug.

Sam: At one point David Bennett’s immune system took such a dip that they actually needed to give him intravenous immunoglobulins or IVIG, which is an antibody serum from healthy volunteers. 

Muhammad Mohiuddin: But what we didn't realize at that time, that pool serum also had antibodies against pigs. So we believe that those antibodies kind of attack the pig heart and caused the graft to fail.

Sam: In addition, the pig heart David Bennett received was unknowingly infected with a latent virus called CMV.

Muhammad Mohiuddin: We do look for the viruses in these donors. However, we were not able to detect this virus because it was very deep seated. Since this patient, we have developed a lot of new techniques to detect even these deep-seated viruses. And in the second case, we were able to screen that virus out, so we didn’t see any issue with any virus in the second case. 

So if you say, what were the reasons the first patient lived for only 60 days, not a hundred days? I would say the number one was his own condition, a very vulnerable condition where we could not maintain the immunosuppression that we wanted to give to protect the heart from rejection. Number two was the IVIG that we gave acted against the heart. And number three, the virus may have caused some kind of initial damage or immune reaction that may have caused the destruction of the heart cells, causing rejection or graft failure.

Sam: Right now, the team at the University of Maryland is continuing to evaluate patients for future pig heart transplants while also optimizing their approach. 

Muhammad Mohiuddin: Every transplant, we will learn something, but we want to improve and not repeat. Everything that we learned in the first transplant that we thought that had gone wrong, we never repeated it.

Sam: The second patient, 58 year old Lawrence Faucette, received a genetically modified pig heart on September 20, 2023 and lived for nearly 6 weeks before his body rejected it. Faucette, a retired U.S. Navy veteran and histology technician at the NIH, was able to spend that time with family members and even begin physical therapy. 

George: There has been so much work to get to this point, and people like David Bennett and Lawrence Faucette have been invaluable in that progress. But there are also many other species, including pigs but also many non-human primates, that have played an essential role in understanding immune rejection. One of the main reasons the FDA granted compassionate use approval for David Bennett’s surgery was a landmark study done by Muhammad and his team, in which a pig heart was transplanted into a baboon who went on to survive for nearly 3 more years. 

Sam with Muhammad Mohiuddin: Xenotransplantation is, in my opinion, really important, but it's not like you can just go from pig to human and not do a lot of things in between. And a pig heart, it's still a pig. And some people might get upset that it's a pig. And so, how do you navigate those conversations surrounding, ‘is it ethical to have all of these pigs that are being raised for this purpose or working with baboons?’ I’m just kind of curious how you manage that or how you view that in your work. 

Muhammad Mohiuddin: Every single drug that we use these days, or every single procedure, has been tested in an animal before we used it. So it is unfortunate, we all love animals and we don't want to use them for this purpose, but we don't have any computational models or anything else to replace a live human biology. And I've been doing it for the past 33 years. We do receive a lot of, you'll say hate mails or we've been questioned a lot. But I was very surprised when we did these two humans — 99% of the mail or the communication I received was very positive. A lot of people said, “where were you 10 years ago when we lost our dear one?” It is unfortunate that to keep one human alive, we have to kill one pig. But again, 90,000 pigs are killed per day in the United States for our dietary needs. That's one of the reasons that pigs were chosen because they are already being sacrificed every day for other purposes. There are about 105 products that we use — even maybe the makeup you're using — was derived from pig products. To me, saving one life takes precedence over everything. And just imagine if this becomes a routine. Every 80 minutes a patient dies waiting for an organ. So you can save millions of lives throughout the world, just within a year.

George: But baboons and other non-human primates are not the only option for studying xenotransplant rejection. Another approach is experimenting in people who are recently deceased. In 2022, surgeons at NYU made headlines when they transplanted pig hearts with the same 10 genetic changes as David Bennett’s donor pig heart into two recently deceased people who were then monitored for three days. 

Sam: I came across this side of xenotransplantation research in a story titled, “The Allure and Dangers of Experimenting With Brain-Dead Bodies,” written by Jyoti Madhusoodanan, who is a freelance science journalist based in Portland, Oregon. A couple years before writing the story, Jyoti was working on an article about xenotransplantation for the Journal of the American Medical Association. David Bennett had just received his pig heart transplant.

Jyoti Madhusoodanan: There was a lot of news and excitement about it, and I was speaking with researchers about not just the transplant itself, but this massive body of work they had done leading up to that moment. And in the course of that conversation, it came about that some of that research had been done in people who were recently deceased.

George: Recently deceased people are sometimes referred to as brain dead people or decedents. And although they’re legally dead, machines keep their blood pumping and air flowing into and out of their lungs. What Jyoti soon learned was that using recently deceased people opens up a massive can of worms when it comes to regulation. In the U.S., since 1991 we’ve had the Federal Policy for the Protection of Human Subjects which is also known as the “Common Rule.”

Jyoti Madhusoodanan: And the Common Rule is basically a set of federal policies that are meant to protect people who participate in scientific research. So the common rule covers things like making sure that protected classes of people like children or communities that are especially vulnerable, like people who are in prison or pregnant people, are protected from experiments that could be harmful to them or that might in some way violate their freedoms, for instance.

George: And over time the Common Rule has expanded and shifted to be more encompassing of different kinds of research. 

Jyoti Madhusoodanan: It also covers biospecimens, which are things like blood or tissue or organs. And essentially for all of these things that are done with people who are living, whether they're minors or not, or whether they're tissue samples or not, you have things like informed consent, meaning no one can use your tissue, a blood sample from you for a genetic test or whatever, without your consent. There's also institutional review boards which offer oversight within institutions. 

Sam: Researchers proposing to do work with living people have to get approval from an institutional review board or IRB before moving ahead with a project. 

Jyoti Madhusoodanan: And there's also another set of rules about research involving tissues or bodies of people who are deceased. But what I discovered while reporting this is that recently deceased people, people who've been declared brain dead are in this gray area. So there's not a lot of regulation about how to do research with recently deceased subjects. There are groups of researchers and ethicists who've developed guidance to help the community, but none of that is formal regulation per se.

Sam: And what that does is open things up to a range of treatment, both good and bad.  

Jyoti Madhusoodanan: There were some truly wonderful stories from the U.S. actually, where it was moving to see how much researchers cared about doing things the right way. They sort of adhered to the highest standards they could find because there weren't any other standards for them, which was really heartwarming and wonderful to see. At the same time, there was this one instance from India that really stood out. It started out as a U.S.-based company that wanted to conduct experiments with trying to revive brain dead subjects, and they didn't get the consent they needed in the U.S. So then they moved out of the U.S. to, they say, a few different countries, and didn't really get off the ground because of the pandemic. But there is one institute in India where the researcher says they have been continuing that sort of work on their own without the US company being involved, and they are using a combination of stem cells and other treatments to literally revive brain activity in people who've suffered brain injuries during traffic accidents.

Sam: Although they’re not working in the xenotransplantation space, at least to our knowledge, it gives you a sense of how so much gray area surrounding this regulation means things can get dicey real fast. 

Jyoti Madhusoodanan: All the researchers that I spoke with about that work described it as premature. Ethicists have published review articles, opinion pieces, describing how that work is essentially exploitative of grieving families by giving them false hope that their loved ones might come back to life. And if you contrast that with the xenotransplantation work, where there has been decades of work in animal models to see what needs to be done to make that process feasible for humans. And then they carried out the work in recently deceased subjects and then went into a living human, which is a very methodical, systematic way of bringing the research to a point where it's acceptable to experiment on a human being.

George: The 2022 NYU study where pig hearts were transplanted into two recently deceased people was a great example of decedent work being done ethically. One of the people in that study was a woman named Alva Capuano.

Jyoti Madhusoodanan: Alva Capuano had dealt with so many health issues over the course of her life that really epitomized the need for xenotransplant research. 

George: In her reporting, Jyoti had the opportunity to speak with Alva’s son Tim. 

Jyoti Madhusoodanan: The conversation I had with Tim really framed for me how when research is done well, how it can really build trust with people who participate. His mom had signed up to donate her own organs because of her complicated life experience and knowing the value of a donated organ. And unfortunately, it turned out that because of her complicated medical history, when they were trying to arrange this gift at the end of her life, the family kept running into rejection of people telling them they can't use this organ or that organ or the other. And it was this really traumatizing, grueling process for them. And they were really reaching the end of it, end of that process when they heard about the possibility of her participating in this study, in this experiment.

George: The medical team at NYU explained the process to the Capuano family and they decided it was the right move. Alva wanted to contribute to research that could ultimately save the lives of people like her who were in need of a transplant. 

Jyoti Madhusoodanan: And apparently, Tim said during our interview, that at the time, during those few days that they were conducting the experiment, the researchers would call them. So they had frequent updates about how things were going, what the researchers were learning from what they were doing, and things like that, which is just a really sweet example of how science that engages the people that it hopes to help can do so much more when it's done well.

Sam: In this country, every 8 minutes someone is added to the transplant waiting list. As of March of this year over 3,000 children and adults were waiting for a new heart. And now, cardiac xenotransplantation is no longer some sci-fi pipe dream. There are a lot of patients and doctors out there with a lot of hope. 

Let's tiny show-and-tell.

George: All right, let's do it.

Sam: What do you think, George? First tiny show-and-tell.

George: Yeah, I know. Does that mean I should go first or second?

Sam: You decide.

George: Normally I do rock, paper, scissors, but I'll just go first.

Sam: Okay, go for it.

George: My tiny show-and-tell has to do with a disease called progeria. Have you ever heard of it?

Sam: I've heard of it, but I can't quite remember, so remind me.

George: So it's basically... It's kids with super accelerated aging. So it's like a 10-year-old who looks like a 50-year-old, a 27-year-old who looks like an 85-year-old, that kind of thing. And interestingly, I didn't know this, but it's caused by a single-point mutation in one gene. It's a really, really rare disease. It only affects... I think there's only 18 living patients in the US.

And the thing that was highlighted by this article that I was reading about this is that there are about 7,000 genetic disorders for which we know the mutation. 85% of those disorders are super, super rare, and only a few hundred of the 7,000 currently have any sort of treatment, and progeria is one of those. There was no treatment. So researchers created a protein that actually fixes this point mutation. They tested it in mice. It showed a lot of promise. And this is where you get to guess. Can you guess how they created this protein that fixed the problem in mice?

Sam: How they created it? It's like an enzyme? Did they do something in pigs?

George: I actually don't know what animal they did it in. So this is actually... Maybe you're the wrong person to ask because you know about these things, but I feel like most people would just be like, "CRISPR. They used CRISPR, right?" And the shocker is they did not. They used directed evolution. So not all work being done in genetics and proteins is CRISPR, which I thought was cool.

So the next step is to do a clinical trial in humans, which they want to do in the next two years. And the NIH director, he's one of the labs that did this work. So I thought that was also very, very cool. And that's the limit of my... I'm a chemist by training, so that's... Everything I just told you is the limit of my biology knowledge.

Sam: That's really fascinating. In the episode description, we always link to the article or the paper so that if someone is listening and they want to really deep dive, then they know where to go for it.

George: Great. It's a nature paper, so it'll be fun.

Sam: Today I have something very different for you. I have a cool cicada fact for you that was actually brought to my attention by a colleague's five-year-old daughter named Ellie. So thank you, Ellie. Before I get to Ellie's fact, I'm going to talk a little bit about cicadas. So just bear with me, George. I know you're not a big fan.

George: Yeah, I'm not. Insects are not my jam. But...

George: Go ahead.

Sam: Well, you’re going to have to suck it up for a sec. All right. There's of course been a lot of cicada talk in recent years, with people particularly excited about the Magicicada genus. Those are the ones that hang out underground in their nymphal stage for up to 17 years and then have this big emergence. They're what are called periodical cicadas, and groups of them called broods will emerge all at once in a specific area and year based on a very predictable cycle of development, which is kind of cool that you can say, "Okay, they're back underground, but they're actually going to come out now in 13 years or 15 years or whatever it may be."

So after female cicadas mate, they go to lay their eggs. And when they do that, they use something called an ovipositor to cut through wood, typically trees, where they then lay those eggs. So this ovipositor kind of looks like a serrated sword, and it sticks out of the female cicada's abdomen.

George: Love it.

Sam: Yeah. It's a great visual. And you would think to cut through wood, it must be pretty strong, right? So a few years back, researchers hypothesized that it might contain different inorganic elements to make it strong, including different metals. So they used a couple different techniques. They used energy-dispersive X-ray spectroscopy and electron microscopy to identify and quantify the elements that were actually present in the ovipositor and then be able to map their locations as well. And so they found 14 inorganic elements including silicon, iron, and zinc. So cicadas are part metal, and that was Ellie's fact.

George: That's amazing.

Sam: And then I went a little bit deeper, but don't worry, I'm almost done with these insects. So something else that I really found fascinating was that a lot of these, what are called cuticles on insects, so this is technically what it’s called. The ovipositor is a cuticle. A lot of them are reinforced with metal, including spider fangs,

George: Oh no.

Sam: Insect mandibles, which are the appendages that are near the mouth that help them crush or bite or cut things, and also, the jaws of marine polychaetes, which are these very creepy-looking worms that you find in the water. So just a fun little fact. I hope you sleep well tonight, George.

George: I guarantee you I will not. Thanks, Sam.

Thanks for tuning in to this week’s episode of Tiny Matters, a production of the American Chemical Society. This week’s script was written by Sam, who is also our executive producer, and was edited by me, George Zaidan, and by Michael David. It was fact-checked by Michelle Boucher. The Tiny Matters theme and episode sound design is by Michael Simonelli and the Charts & Leisure team. 

Sam: Thanks so much to Muhammad Mohiuddin and Jyoti Madhusoodanan for joining us. To be featured in our bonus series, “Tiny Show and Tell Us,” write in to tinymatters@acs.org with science news you’re itching to share, a science factoid you love telling friends about, or maybe even a personal science story. We want to hear about it! And while you’re at it, subscribe to our newsletter! I’ve put links in the episode description. See ya next time!

George: Please don’t share insect stories anymore.

Sam: Only insect tiny show and tells from now on.

George: No insect stories…


Ethical Issues in Research: Perceptions of Researchers, Research Ethics Board Members and Research Ethics Experts

  • Published: 12 August 2022
  • Volume 21, pages 269–292 (2023)


  • Marie-Josée Drolet (ORCID: orcid.org/0000-0001-8384-4193)
  • Eugénie Rose-Derouin
  • Julie-Claude Leblanc
  • Mélanie Ruest
  • Bryn Williams-Jones


In the context of academic research, a diversity of ethical issues arises, conditioned by the different roles that members occupy within these institutions. Previous studies on this topic have mainly addressed the perceptions of researchers. To our knowledge, however, no studies have explored transversal ethical issues from a wider spectrum that includes other members of academic institutions, such as research ethics board (REB) members and research ethics experts. The present study used a descriptive phenomenological approach to document the ethical issues experienced by a heterogeneous group of Canadian researchers, REB members, and research ethics experts. Data collection involved socio-demographic questionnaires and individual semi-structured interviews. Following the triangulation of these different perspectives (researchers, REB members and ethics experts), the emerging ethical issues were synthesized into ten units of meaning: (1) research integrity, (2) conflicts of interest, (3) respect for research participants, (4) lack of supervision and power imbalances, (5) individualism and performance, (6) inadequate ethical guidance, (7) social injustices, (8) distributive injustices, (9) epistemic injustices, and (10) ethical distress. This study highlights several problematic elements that can support the identification of future solutions to transversal ethical issues in research that affect the heterogeneous members of the academic community.



Introduction

Research includes a set of activities in which researchers use various structured methods to contribute to the development of knowledge, whether theoretical, fundamental, or applied (Drolet & Ruest, accepted). University research is carried out in a highly competitive environment characterized by ever-increasing demands (i.e., on time and productivity) and insufficient access to research funds, within a market economy that values productivity and speed, often to the detriment of quality or rigour. This research context creates a perfect recipe for breaches in research ethics, such as research misbehaviour or misconduct, i.e., conduct that is ethically questionable or unacceptable because it contravenes the accepted norms of responsible conduct of research or compromises respect for core ethical values that are widely held by the research community (Drolet & Girard, 2020; Sieber, 2004). Problematic ethics and integrity issues – e.g., conflicts of interest, falsification of data, non-respect of participants’ rights, and plagiarism, to name but a few – have the potential both to undermine the credibility of research and to lead to negative consequences for many stakeholders, including researchers, research assistants and personnel, research participants, academic institutions, and society as a whole (Drolet & Girard, 2020). It is thus evident that the academic community should be able to identify these different ethical issues in order to evaluate the nature of the risks they pose (and for whom), and then work towards their prevention or management (i.e., education, enhanced policies and procedures, risk mitigation strategies).

In this article, we define an “ethical issue” as any situation that may compromise, in whole or in part, the respect of at least one moral value (Swisher et al., 2005) that is considered socially legitimate and should thus be respected. In general, ethical issues occur at three key moments or stages of the research process: (1) research design (i.e., conception, project planning), (2) research conduct (i.e., data collection, data analysis) and (3) knowledge translation or communication (e.g., publication of results, conferences, press releases) (Drolet & Ruest, accepted). According to Sieber (2004), ethical issues in research can be classified into five categories, related to: (a) communication with participants and the community, (b) acquisition and use of research data, (c) external influence on research, (d) risks and benefits of the research, and (e) selection and use of research theories and methods. Many of these issues are related to breaches of research ethics norms, misbehaviour or research misconduct. Bruhn et al. (2002) developed a typology of misbehaviour and misconduct in academia that can be used to judge the seriousness of different cases. This typology takes into consideration two axes of reflection: (a) the origin of the situation (i.e., is it the researcher’s own fault or due to the organizational context?), and (b) the scope and severity (i.e., is this the first instance or a recurrent behaviour? What is the nature of the situation? What are the consequences, for whom, for how many people, and for which organizations?).

A previous detailed review of the international literature on ethical issues in research revealed several interesting findings (Beauchemin et al., 2021). Indeed, the current literature is dominated by descriptive ethics, i.e., the sharing by researchers from various disciplines of the ethical issues they have personally experienced. While such anecdotal documentation is relevant, it is insufficient because it does not provide a global view of the situation. Among the reviewed literature, empirical studies were in the minority (Table 1): only about one fifth of the sample (n = 19) presented empirical research findings on ethical issues in research. The first of these studies was conducted almost 40 years ago (Hunt et al., 1984), with the remainder conducted in the 1990s. Eight studies were conducted in the United States (n = 8), five in Canada (n = 5), three in England (n = 3), two in Sweden (n = 2) and one in Ghana (n = 1).

Further, the majority of studies in our sample (n = 12) collected the perceptions of a homogeneous group of participants, usually researchers (n = 14) and sometimes health professionals (n = 6). A minority of studies (n = 7) triangulated the perceptions of diverse research stakeholders (i.e., researchers and research participants, or students). To our knowledge, only one study has examined perceptions of ethical issues in research by research ethics board members (REB; Institutional Review Boards [IRB] in the USA), and none to date have documented the perceptions of research ethics experts. Finally, nine studies (n = 9) adopted a qualitative design, seven studies (n = 7) a quantitative design, and three (n = 3) a mixed-methods design.

More studies using empirical research methods are needed to better identify broader trends, to enrich discussions on the values that should govern responsible conduct of research in the academic community, and to evaluate the means by which these values can be supported in practice (Bahn, 2012 ; Beauchemin et al., 2021 ; Bruhn et al., 2002 ; Henderson et al., 2013 ; Resnik & Elliot, 2016; Sieber 2004 ). To this end, we conducted an empirical qualitative study to document the perceptions and experiences of a heterogeneous group of Canadian researchers, REB members, and research ethics experts, to answer the following broad question: What are the ethical issues in research?

Research Methods

Research Design

A qualitative research approach involving individual semi-structured interviews was used to systematically document ethical issues (DePoy & Gitlin, 2010; Hammell et al., 2000). Specifically, a descriptive phenomenological approach inspired by the philosophy of Husserl was used (Husserl, 1970, 1999), as it is recommended for documenting the perceptions of ethical issues raised by various practices (Hunt & Carnevale, 2011).

Ethical considerations

The principal investigator obtained ethics approval for this project from the Research Ethics Board of the Université du Québec à Trois-Rivières (UQTR). All members of the research team signed a confidentiality agreement, and research participants signed the consent form after reading an information letter explaining the nature of the research project.

Sampling and recruitment

As indicated above, three types of participants were sought: (1) researchers from different academic disciplines conducting research (i.e., theoretical, fundamental or empirical) in Canadian universities; (2) REB members working in Canadian organizations responsible for the ethical review, oversight or regulation of research; and (3) research ethics experts, i.e., academics or ethicists who teach research ethics, conduct research in research ethics, or have acquired a specialization in research ethics. To be included in the study, participants had to work in Canada, speak and understand English or French, and be willing to participate. Thomas and Pollio (2002) recommend recruiting between six and twelve participants to ensure data saturation in a homogeneous sample; given our heterogeneous sample, we aimed to recruit approximately twelve participants. Having used this method several times in related projects in professional ethics, we have found that data saturation is usually achieved with 10 to 15 participants (Drolet & Goulet, 2018; Drolet & Girard, 2020; Drolet et al., 2020). From experience, larger samples only serve to increase the degree of data saturation, especially in heterogeneous samples (Drolet et al., 2017, 2019; Drolet & Maclure, 2016).

Purposive sampling facilitated the identification of participants relevant to documenting the phenomenon in question (Fortin, 2010). To ensure as rich and complete a representation of perceptions as possible, we sought participants with varied and complementary characteristics with regard to the social roles they occupy in research practice (Drolet & Girard, 2020). A triangulation of sources was used for recruitment (Bogdan & Biklen, 2006). The websites of Canadian universities and Canadian health institution REBs, as well as those of major Canadian granting agencies (i.e., the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council of Canada, the Social Sciences and Humanities Research Council of Canada, and the Fonds de recherche du Québec), were searched to identify individuals who might be interested in participating in the study. Further, people known by the research team for their knowledge of and sensitivity to ethical issues in research were asked to participate. Research participants were also asked to suggest other individuals who met the study criteria.

Data Collection

Two tools were used for data collection: (a) a socio-demographic questionnaire, and (b) a semi-structured individual interview guide. English and French versions of these two documents were made available, depending on participant preferences. In addition, although the interview guides contained the same questions, these were adapted to participants’ specific roles (i.e., researcher, REB member, research ethics expert). When contacted by email by the research assistant, participants were asked to confirm the role under which they wished to participate (because some participants might have multiple, overlapping responsibilities), and they were then sent the appropriate interview guide.

The interview guides each had two parts: an introduction and a section on ethical issues. The introduction consisted of general questions to put the participant at ease (e.g., “Tell me what a typical day at work is like for you”). The section on ethical issues was designed to capture the participant’s perceptions through questions such as: “Tell me about three situations you have experienced at work that involved an ethical issue” and “Do you feel that your organization is doing enough to address, manage, and resolve ethical issues in your work?”. Although some interviews were conducted in person, the majority were conducted by videoconference, both to promote accessibility and because of the COVID-19 pandemic. Interviews were digitally recorded so that they could be transcribed in full; they lasted between 40 and 120 min, with an average of 90 min. Research assistants conducted the interviews and transcribed the recordings.

Data Analysis

The socio-demographic questionnaires were subjected to simple descriptive statistical analyses (i.e., means and totals), and the semi-structured interviews were subjected to qualitative analysis. The steps proposed by Giorgi (1997) for a Husserlian phenomenological reduction of the data were followed. After the interviews were collected, recorded, and transcribed, each transcript was analyzed by at least two analysts: a research assistant (2nd author of this article) and either the principal investigator (1st author) or a postdoctoral fellow (3rd author). Repeated reading of the transcripts allowed the first analyst to write a synopsis, i.e., an initial extraction of units of meaning. The second analyst then read the synopses, which were commented on and improved as necessary. Agreement between analysts allowed the final drafting of the interview synopses, which were then analyzed by three analysts to generate and organize the units of meaning that emerged from the qualitative data.

Participants

Sixteen individuals (n = 16) participated in the study, of whom nine (9) identified as female and seven (7) as male (Table 2). Participants ranged in age from 22 to 72 years, with a mean age of 47.5 years. Participants had between one (1) and 26 years of experience in the research setting, with an average of 14.3 years of experience. Participants held a variety of roles, including: REB member (n = 11), researcher (n = 10), research ethics expert (n = 4), and research assistant (n = 1). As mentioned previously, seven (7) participants held more than one role, i.e., REB member, research ethics expert, and researcher. The majority (87.5%) of participants were working in Quebec, with the remainder working in other Canadian provinces. Although all participants considered themselves francophone, one quarter (n = 4) identified as belonging to a cultural minority group.

With respect to their academic background, most participants (n = 9) had a PhD, three (3) had a post-doctorate, two (2) had a master’s degree, and two (2) had a bachelor’s degree. Participants came from a variety of disciplines: nine (9) had a specialty in the humanities or social sciences, four (4) in the health sciences and three (3) in the natural sciences. In terms of their knowledge of ethics, five (5) participants reported having taken one university course entirely dedicated to ethics, four (4) reported having taken several university courses entirely dedicated to ethics, three (3) had a university degree dedicated to ethics, while two (2) only had a few hours or days of training in ethics and two (2) reported having no knowledge of ethics.
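The demographic summary above relies only on simple descriptive statistics (totals, means, and ranges). A minimal Python sketch of these calculations is shown below; the individual ages are illustrative placeholders chosen only so that the aggregates match the reported figures (n = 16, age range 22–72, mean age 47.5), not the study's actual raw data, while the role counts are taken from the text:

```python
from statistics import mean

# Hypothetical ages for the 16 participants, constructed so the aggregates
# match the reported figures (range 22-72, mean 47.5); NOT the study's raw data.
ages = [22, 30, 35, 38, 40, 42, 45, 47, 48, 50, 52, 55, 58, 60, 66, 72]

# Role totals as reported; participants could hold more than one role,
# so the totals exceed n = 16.
role_counts = {"REB member": 11, "researcher": 10,
               "research ethics expert": 4, "research assistant": 1}

n = len(ages)
print(f"n = {n}")                                  # n = 16
print(f"mean age = {mean(ages):.1f}")              # mean age = 47.5
print(f"age range = {min(ages)}-{max(ages)}")      # age range = 22-72
for role, count in sorted(role_counts.items(), key=lambda kv: -kv[1]):
    print(f"{role}: {count}")
```

Computing the aggregates from the raw questionnaire data in this way, rather than by hand, makes reported figures such as the 47.5-year mean age straightforward to re-check.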

Ethical issues

As Fig. 1 illustrates, ten units of meaning emerged from the data analysis, namely: (1) research integrity, (2) conflicts of interest, (3) respect for research participants, (4) lack of supervision and power imbalances, (5) individualism and performance, (6) inadequate ethical guidance, (7) social injustices, (8) distributive injustices, (9) epistemic injustices, and (10) ethical distress. To illustrate the results, excerpts from the interviews are presented in the following sub-sections. Most excerpts have been translated into English, as the majority of interviews were conducted with French-speaking participants.

figure 1

Ethical issues in research according to the participants

Research Integrity

The research environment is highly competitive and performance-based. Several participants, in particular researchers and research ethics experts, felt that this environment can lead both researchers and research teams to engage in unethical behaviour that reflects a lack of research integrity. For example, as some participants indicated, competition for grants and scientific publications is sometimes so intense that researchers falsify research results or plagiarize from colleagues to achieve their goals.

Some people will lie or exaggerate their research findings in order to get funding. Then, you see it afterwards, you realize: “ah well, it didn’t work, but they exaggerated what they found and what they did” (participant 14).

Another problem in research is the identification of authors when there is a publication. Very often, there are authors who don’t even know what the publication is about and that their name is on it. (…) The time that it surprised me the most was just a few months ago when I saw someone I knew who applied for a teaching position. He got it and I was super happy for him. Then I looked at his publications and … there was one that caught my attention much more than the others, because I was in it and I didn’t know what that publication was. I was the second author of a publication that I had never read (participant 14).

I saw a colleague who had plagiarized another colleague. [When the colleague] found out about it, he complained. So, plagiarism is a serious [ethical breach]. I would also say that there is a certain amount of competition in the university faculties, especially for grants (…). There are people who want to win at all costs or get as much as possible. They are not necessarily going to consider their colleagues. They don’t have much of a collegial spirit (participant 10).

These examples of research misbehaviour or misconduct are sometimes due to or associated with situations of conflicts of interest, which may be poorly managed by certain researchers or research teams, as noted by many participants.

Conflict of interest

The actors and institutions involved in research have diverse interests, like all humans and institutions. As noted in Chap. 7 of the Canadian Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (TCPS2, 2018),

“researchers and research students hold trust relationships, either directly or indirectly, with participants, research sponsors, institutions, their professional bodies and society. These trust relationships can be put at risk by conflicts of interest that may compromise independence, objectivity or ethical duties of loyalty. Although the potential for such conflicts has always existed, pressures on researchers (i.e., to delay or withhold dissemination of research outcomes or to use inappropriate recruitment strategies) heighten concerns that conflicts of interest may affect ethical behaviour” (p. 92).

The sources of these conflicts are varied and can include interpersonal conflicts, financial partnerships, third-party pressures, academic or economic interests, a researcher holding multiple roles within an institution, or any other incentive that may compromise a researcher’s independence, integrity, and neutrality (TCPS2, 2018). While it is not possible to eliminate all conflicts of interest, it is important to manage them properly and to avoid temptations to behave unethically.

Ethical temptations correspond to situations in which people are tempted to prioritize their own interests to the detriment of the ethical goods that should, in their own context, govern their actions (Swisher et al., 2005 ). In the case of researchers, this refers to situations that undermine independence, integrity, neutrality, or even the set of principles that govern research ethics (TCPS2, 2018) or the responsible conduct of research. According to study participants, these types of ethical issues frequently occur in research. Many participants, especially researchers and REB members, reported that conflicts of interest can arise when members of an organization make decisions to obtain large financial rewards or to increase their academic profile, often at the expense of the interests of members of their research team, research participants, or even the populations affected by their research.

A company that puts money into making its drug work wants its drug to work. So, homeopathy is a good example, because there are not really any consequences of homeopathy, there are not very many side effects, because there are no effects at all. So, it’s not dangerous, but it’s not a good treatment either. But some people will want to make it work. And that’s a big issue when you’re sitting at a table and there are eight researchers, and there are two or three who are like that, and then there are four others who are neutral, and I say to myself, this is not science. I think that this is a very big ethical issue (participant 14).

There are also times in some research where there will be more links with pharmaceutical companies. Obviously, there are then large amounts of money that will be very interesting for the health-care institutions because they still receive money for clinical trials. They’re still getting some compensation because it’s time consuming for the people involved and all that. The pharmaceutical companies have money, so they will compensate, and that is sometimes interesting for the institutions, and since we are a bit caught up in this, in the sense that we have no choice but to accept it. (…) It may not be the best research in the world, there may be a lot of side effects due to the drugs, but it’s good to accept it, we’re going to be part of the clinical trial (participant 3).

It is integrity, what we believe should be done or said. Often by the pressure of the environment, integrity is in tension with the pressures of the environment, so it takes resistance, it takes courage in research. (…) There were all the debates there about the problems of research that was funded and then the companies kept control over what was written. That was really troubling for a lot of researchers (participant 5).

Further, these situations sometimes have negative consequences for research participants as reported by some participants.

Respect for research participants

Many research projects, whether they are psychosocial or biomedical in nature, involve human participants. Relationships between the members of research teams and their research participants raise ethical issues that can be complex. Research projects must always be designed to respect the rights and interests of research participants, and not just those of researchers. However, participants in our study – i.e., REB members, researchers, and research ethics experts – noted that some research teams seem to put their own interests ahead of those of research participants. They also emphasized the importance of ensuring the respect, well-being, and safety of research participants. The ethical issues related to this unit of meaning are:

  • respect for the free, informed and ongoing consent of research participants;
  • respect for and the well-being of participants;
  • data protection and confidentiality;
  • over-solicitation of participants;
  • ownership of the data collected on participants;
  • the sometimes high cost of scientific innovations and their accessibility;
  • the balance between the social benefits of research and the risks to participants (particularly in terms of safety);
  • the balance between collective well-being (development of knowledge) and the individual rights of participants;
  • exploitation of participants;
  • paternalism when working with populations in vulnerable situations; and
  • the social acceptability of certain types of research.

The following excerpts present some of these issues.

Where it disturbs me ethically is in the medical field – because it’s more in the medical field that we’re going to see this – when consent forms are presented to patients to solicit them as participants, and then [these forms] have an average of 40 pages. That annoys me. When they say that it has to be easy to understand and all that, adapted to the language, and then the hyper-technical language plus there are 40 pages to read, I don’t understand how you’re going to get informed consent after reading 40 pages. (…) For me, it doesn’t work. I read them to evaluate them and I have a certain level of education and experience in ethics, and there are times when I don’t understand anything (participant 2).

There is a lot of pressure from researchers who want to recruit research participants (…). The idea that when you enter a health care institution, you become a potential research participant, when you say “yes to a research, you check yes to all research”, then everyone can ask you. I think that researchers really have this fantasy of saying to themselves: “as soon as people walk through the door of our institution, they become potential participants with whom we can communicate and get them involved in all projects”. There’s a kind of idea that, yes, it can be done, but it has to be somewhat supervised to avoid over-solicitation (…). Researchers are very interested in facilitating recruitment and making it more fluid, but perhaps to the detriment of confidentiality, privacy, and respect; sometimes that’s what it is, to think about what type of data you’re going to have in your bank of potential participants? Is it just name and phone number or are you getting into more sensitive information? (participant 9).

In addition, one participant reported that their university does not provide the resources required to respect the confidentiality of research participants.

The issue is as follows: researchers, of course, commit to protecting data with passwords and all that, but we realize that in practice, it is more difficult. It is not always as protected as one might think, because professor-researchers will run out of space. Will the universities make rooms available to researchers, places where they can store these things, especially when they have paper documentation, and is there indeed a guarantee of confidentiality? Some researchers have told me: “Listen; there are even filing cabinets in the corridors”. So, that certainly poses a concrete challenge. How do we go about challenging the administrative authorities? Tell them it’s all very well to have an ethics committee, but you have to help us, you also have to make sure that the necessary infrastructures are in place so that what we are proposing is really put into practice (participant 4).

If the relationships with research participants are likely to raise ethical issues, so too are the relationships with students, notably research assistants. On this topic, several participants discussed the lack of supervision or recognition offered to research assistants by researchers as well as the power imbalances between members of the research team.

Lack of Supervision and Power Imbalances

Many research teams are composed not only of researchers, but also of students who work as research assistants. The relationships between research assistants and other members of research teams can sometimes be problematic and raise ethical issues, particularly because of the inevitable power asymmetries. In the context of this study, several participants – including a research assistant, REB members, and researchers – discussed the lack of supervision or recognition of the work carried out by students, psychological pressure, and the more or less well-founded promises that are sometimes made to students. Participants also mentioned the exploitation of students by certain research teams, which manifests when students are inadequately paid, i.e., paid in a way that does not reflect the number of hours actually worked, not paid a fair wage, or not paid at all.

[As a research assistant], it was more of a feeling of distress that I felt then because I didn’t know what to do. (…) I was supposed to get coaching or be supported, but I didn’t get anything in the end. It was like, “fix it by yourself”. (…) All research assistants were supposed to be supervised, but in practice they were not (participant 1).

Very often, we have a master’s or doctoral student that we put on a subject and we consider that the project will be well done, while the student is learning. So, it happens that the student will do a lot of work and then we realize that the work is poorly done, and it is not necessarily the student’s fault. He wasn’t necessarily well supervised. There are directors who have 25 students, and they just don’t supervise them (participant 14).

I think it’s really the power relationship. I thought to myself, how I saw my doctorate, the beginning of my research career, I really wanted to be in that laboratory, but they are the ones who are going to accept me or not, so what do I do to be accepted? I finally accept their conditions [which was to work for free]. If these are the conditions that are required to enter this lab, I want to go there. So, what do I do, well I accepted. It doesn’t make sense, but I tell myself that I’m still privileged, because I don’t have so many financial worries, one more reason to work for free, even though it doesn’t make sense (participant 1).

In research, we have research assistants. (…). The fact of using people… so that’s it, you have to take into account where they are, respect them, but at the same time they have to show that they are there for the research. In English, we say “carry” or take care of people. With research assistants, this is often a problem that I have observed: for grant machines, the person is the last to be found there. Researchers, who will take, use student data, without giving them the recognition for it (participant 5).

The problem at our university is that they reserve funding for Canadian students. The doctoral clientele in my field is mostly foreign students. So, our students are poorly funded. I saw one student end up in the shelter, in a situation of poverty. It ended very badly for him because he lacked financial resources. Once you get into that dynamic, it’s very hard to get out. I was made aware of it because the director at the time had taken him under her wing and wanted to try to find a way to get him out of it. So, most of my students didn’t get funded (participant 16).

There I wrote “manipulation”, but it’s kind of all promises all the time. I, for example, was promised a lot of advancement, like when I got into the lab as a graduate student, it was said that I had an interest in [this particular area of research]. I think there are a lot of graduate students who must have gone through that, but it is like, “Well, your CV has to be really good, if you want to do a lot of things and big things. If you do this, if you do this research contract, the next year you could be the coordinator of this part of the lab and supervise this person, get more contracts, be paid more. Let’s say: you’ll be invited to go to this conference, this big event”. They were always dangling something, but you have to do that first to get there. But now, when you’ve done that, you have to do this business. It’s like a bit of manipulation, I think. That was very hard to know who is telling the truth and who is not (participant 1).

These ethical issues have significant negative consequences for students. Indeed, they sometimes find themselves at the mercy of researchers, for whom they work, struggling to be recognized and included as authors of an article, for example, or to receive the salary that they are due. For their part, researchers also sometimes find themselves trapped in research structures that can negatively affect their well-being. As many participants reported, researchers work in organizations that set very high productivity standards and in highly competitive contexts, all within a general culture characterized by individualism.

Individualism and Performance

Participants, especially researchers, discussed the culture of individualism and performance that characterizes the academic environment. In glorifying excellence, some universities value performance and productivity, often at the expense of psychological well-being and work-life balance (i.e., work overload and burnout). Participants noted that there are ethical silences in their organizations on this issue, and that the culture of individualism and performance is not challenged for fear of retribution or simply to survive, i.e., to perform as expected. Participants felt that this culture can have a significant negative impact on the quality of the research conducted, as research teams try to maximize the quantity of their work (instead of quality) in a highly competitive context, which is then exacerbated by a lack of resources and support, and where everything must be done too quickly.

The work-life balance with the professional ethics related to work in a context where you have too much and you have to do a lot, it is difficult to balance all that and there is a lot of pressure to perform. If you don’t produce enough, that’s it; after that, you can’t get any more funds, so that puts pressure on you to do more and more and more (participant 3).

There is a culture, I don’t know where it comes from, and that is extremely bureaucratic. If you dare to raise something, you’re going to have many, many problems. They’re going to make you understand it. So, I don’t talk. It is better: your life will be easier. I think there are times when you have to talk (…) because there are going to be irreparable consequences. (…) I’m not talking about a climate of terror, because that’s exaggerated, it’s not true, people are not afraid. But people close their office door and say nothing because it’s going to make their work impossible and they’re not going to lose their job, they’re not going to lose money, but researchers need time to be focused, so they close their office door and say nothing (participant 16).

Researchers must produce more and more, yet they receive little guidance on how to produce ethically or on how much, exactly, they are expected to produce. As this participant reports, the expectation is an unspoken rule: more is always better.

It’s sometimes the lack of a clear line on what the expectations are as a researcher, like, “ah, we don’t have any specific expectations, but produce, produce, produce, produce.” So, in that context, it’s hard to be able to put the line precisely: “have I done enough for my work?” (participant 3).

Inadequate Ethical Guidance

While the productivity expectation is not clear, some participants – including researchers, research ethics experts, and REB members – also felt that the ethical expectations of some REBs were unclear. The issue of the inadequate ethical guidance of research includes the administrative mechanisms to ensure that research projects respect the principles of research ethics. According to those participants, the forms required for both researchers and REB members are increasingly long and numerous, and one participant noted that the standards to be met are sometimes outdated and disconnected from the reality of the field. Multicentre ethics review (by several REBs) was also critiqued by a participant as an inefficient method that encumbers the processes for reviewing research projects. Bureaucratization imposes an ever-increasing number of forms and ethics guidelines that actually hinder researchers’ ethical reflection on the issues at stake, leading the ethics review process to be perceived as purely bureaucratic in nature.

The ethical dimension and the ethical review of projects have become increasingly bureaucratized. (…) When I first started working (…) it was less bureaucratic, less strict then. I would say [there are now] tons of forms to fill out. Of course, we can’t do without it, it’s one of the ways of marking out ethics and ensuring that there are ethical considerations in research, but I wonder if it hasn’t become too bureaucratized, so that it’s become a kind of technical reflex to fill out these forms, and I don’t know if people really do ethical reflection as such anymore (participant 10).

The fundamental structural issue, I would say, is the mismatch between the normative requirements and the real risks posed by the research, i.e., we have many, many requirements to meet; we have very long forms to fill out but the research projects we evaluate often pose few risks (participant 8).

People [in vulnerable situations] were previously unable to participate because of overly strict research ethics rules that were to protect them, but in the end [these rules] did not protect them. There was a perverse effect, because in the end there was very little research done with these people and that’s why we have very few results, very little evidence [to support practices with these populations] so it didn’t improve the quality of services. (…) We all understand that we have to be careful with that, but when the research is not too risky, we say to ourselves that it would be good because for once a researcher who is interested in that population, because it is not a very popular population, it would be interesting to have results, but often we are blocked by the norms, and then we can’t accept [the project] (participant 2).

Moreover, as one participant noted, accessing ethics training can be a challenge.

There is no course on research ethics. […] Then, I find that it’s boring because you go through university and you come to do your research and you know how to do quantitative and qualitative research, but all the research ethics, where do you get this? I don’t really know (participant 13).

Yet, such training could provide relevant tools to resolve, to some extent, the ethical issues that commonly arise in research. That said, and as noted by many participants, many ethical issues in research are related to social injustices over which research actors have little influence.

Social Injustices

For many participants, notably researchers, the issues that concern social injustices are those related to power asymmetries, stigma, or issues of equity, diversity, and inclusion, i.e., social injustices related to people’s identities (Blais & Drolet, 2022). Participants reported experiencing or witnessing discrimination from peers, administration, or lab managers. Such oppression is sometimes intersectional, related to a person’s age, cultural background, gender, or social status.

I have my African colleague who was quite successful when he arrived but had a backlash from colleagues in the department. I think it’s unconscious, nobody is overtly racist. But I have a young person right now who is the same, who has the same success, who got exactly the same early career award and I don’t see the same backlash. He’s just as happy with what he’s doing. It’s normal, they’re young and they have a lot of success starting out. So, I think there is discrimination. Is it because he is African? Is it because he is black? I think it’s on a subconscious level (participant 16).

Social injustices were experienced or reported by many participants, and included issues related to difficulties in obtaining grants or disseminating research results in one’s native language (i.e., even when there is official bilingualism) or in being considered credible and fundable when the researcher is a woman.

If you do international research, there are things you can’t talk about (…). It is really a barrier to research to not be able to (…) address this question [i.e. the question of inequalities between men and women]. Women’s inequality is going to be addressed [but not within the country where the research takes place as if this inequality exists elsewhere but not here]. There are a lot of women working on inequality issues, doing work and it’s funny because I was talking to a young woman who works at Cairo University and she said to me: “Listen, I saw what you had written, you’re right. I’m willing to work on this but guarantee me a position at your university with a ticket to go”. So yes, there are still many barriers [for women in research] (participant 16).

Given the varied contextual factors involved in their occurrence, these social injustices are also related to distributive injustices, as many participants discussed.

Distributive Injustices

Although there are several views of distributive justice, a classical definition such as that of Aristotle (2012) describes distributive justice as consisting in distributing honours, wealth, and other social resources or benefits among the members of a community in proportion to their alleged merit. Justice, then, is about determining an equitable distribution of common goods. Contemporary theories of distributive justice are numerous and varied. Indeed, many authors (e.g., Fraser, 2011; Mills, 2017; Sen, 2011; Young, 2011) have, since Rawls (1971), proposed different visions of how social burdens and benefits should be shared within a community to ensure equal respect, fairness, and distribution. In our study, what emerges from participants’ narratives is a definite concern for this type of justice. Women researchers, francophone researchers, early career researchers, and researchers belonging to racialized groups all discussed inequities in the distribution of research grants and awards, and the extra work they must do to somehow prove their worth. These inequities are related to how granting agencies determine which projects will be funded.

These situations make me work 2–3 times harder to prove myself and to show people in power that I have a place as a woman in research (participant 12).

Number one: it’s conservative thinking. The older ones control what comes in. So, the younger people have to adapt or they don’t get funded (participant 14).

Whether it is discrimination against stigmatized or marginalized populations or interest in certain hot topics, granting agencies judge research projects according to criteria that are sometimes questionable, according to those participants. Faced with difficulties in obtaining funding for their projects, several strategies – some of which are unethical – are used by researchers in order to cope with these situations.

Sometimes there are subjects that everyone goes to, such as nanotechnology (…), artificial intelligence or (…) the therapeutic use of cannabis, which are very fashionable, and this is sometimes to the detriment of other research that is just as relevant, but which is (…), less sexy, less in the spirit of the time. (…) Sometimes this can lead to inequities in the funding of certain research sectors (participant 9).

When we use our funds, we get them given to us, we pretty much say what we think we’re going to do with them, but things change… So, when these things change, sometimes it’s an ethical decision, but by force of circumstances I’m obliged to change the project a little bit (…). Is it ethical to make these changes or should I just let the money go because I couldn’t use it the way I said I would? (participant 3).

Moreover, these distributive injustices are linked not only to social injustices, but also to epistemic injustices. Indeed, the way in which research honours and grants are distributed within the academic community depends on the epistemic authority of researchers, which seems to vary notably according to the language they work in, their age, or their gender, but also according to the research design used (inductive versus deductive), their decision to use (or not use) animals in research, or their choice to conduct activist research.

Epistemic Injustices

The philosopher Fricker (2007) conceptualized the notions of epistemic justice and injustice. Epistemic injustice refers to a form of social inequality that manifests itself in the access, recognition, and production of knowledge as well as the various forms of ignorance that arise (Godrie & Dos Santos, 2017). Addressing epistemic injustice necessitates acknowledging the iniquitous wrongs suffered by certain groups of socially stigmatized individuals who have been excluded from knowledge, thus limiting their abilities to interpret, understand, or be heard and account for their experiences. In this study, epistemic injustices were experienced or reported by some participants, notably those related to difficulties in obtaining grants or disseminating research results in one’s native language (i.e., even when there is official bilingualism) or being considered credible and fundable in research when a researcher is a woman or an early career researcher.

I have never sent a grant application to the federal government in English. I have always done it in French, even though I know that when you receive the review, you can see that reviewers didn’t understand anything because they are English-speaking. I didn’t want to get in the boat. It’s not my job to translate, because let’s be honest, I’m not as good in English as I am in French. So, I do them in my first language, which is the language I’m most used to. Then, technically at the administrative level, they are supposed to be able to do it, but they are not good in French. (…) Then, it’s a very big Canadian ethical issue, because basically there are technically two official languages, but Canada is not a bilingual country, it’s a country with two languages, either one or the other. (…) So I was not funded (participant 14).

Researchers who use inductive (or qualitative) methods observed that their projects are sometimes less well reviewed or understood, while research that adopts a hypothetico-deductive (or quantitative) or mixed-methods design is better perceived, considered more credible, and therefore more easily funded. Of course, regardless of whether a research project adopts an inductive, deductive or mixed-methods scientific design, or whether it deals with qualitative or quantitative data, it must respect a set of scientific criteria. A research project should achieve its objectives by using proven methods that, in the case of inductive research, are credible, reliable, and transferable or, in the case of deductive research, generalizable, objective, representative, and valid (Drolet & Ruest, accepted). Participants discussing these issues noted that researchers who adopt a qualitative design, who question the relevance of animal experimentation, or who are not militant have sometimes been unfairly devalued in their epistemic authority.

There is a mini war between quantitative versus qualitative methods, which I think is silly because science is a method. If you apply the method well, it doesn’t matter what the field is, it’s done well and it’s perfect (participant 14).

There is also the issue of the place of animals in our lives, because for me, ethics is human ethics, but also animal ethics. Then, there is a great evolution in society on the role of the animal… with the new law that came out in Quebec on the fact that animals are sensitive beings. Then, with the rise of the vegan movement, [we must ask ourselves]: “Do animals still have a place in research?” That’s a big question and it also means that there are practices that need to evolve, but sometimes there’s a disconnection between what’s expected by research ethics boards versus what’s expected in the field (participant 15).

In research today, we have more and more research that is militant from an ideological point of view. And so, we have researchers, because they defend values that seem important to them, we’ll talk for example about the fight for equality and social justice. They have pressure to defend a form of moral truth and have the impression that everyone thinks like them or should do so, because they are defending a moral truth. This is something that we see more and more, namely the lack of distance between ideology and science (participant 8).

The combination, or intersectionality, of these inequities, which seems to be characterized by a lack of ethical support and guidance, is experienced in the highly competitive and individualistic context of research; it therefore provides the perfect recipe for researchers to experience ethical distress.

Ethical Distress

The concept of “ethical distress” refers to situations in which people know what they should do to act ethically, but encounter barriers, generally of an organizational or systemic nature, that limit their power to act according to their moral or ethical values (Drolet & Ruest, 2021; Jameton, 1984; Swisher et al., 2005). People then run the risk of finding themselves in a situation where they do not act as their ethical conscience dictates, which in the long term can lead to exhaustion and distress. The examples reported by participants in this study point to the fact that researchers in particular may be experiencing significant ethical distress. This distress takes place in a context of extreme competition and constant injunctions to perform, in which administrative demands are increasingly numerous and complex, while paradoxically researchers lack the time to accomplish all their tasks and responsibilities. Added to these demands are a lack of resources (human, ethical, and financial), a lack of support and recognition, and interpersonal conflicts.

We are in an environment, an elite one, you are part of it, you know what it is: “publish or perish” is the motto. Grants, there is a high level of performance required, to do a lot, to publish, to supervise students, to supervise them well, so yes, it is clear that we are in an environment that is conducive to distress. (…) Overwork, definitely, can lead to distress and eventually to exhaustion. When you know that you should take the time to read the projects before sharing them, but you don’t have the time to do that because you have eight that came in the same day, and then you have others waiting… Then someone rings a bell and says: “ah but there, the protocol is a bit incomplete”. Oh yes, look at that, you’re right. You make up for it, but at the same time it’s a bit because we’re in a hurry, we don’t necessarily have the resources or are able to take the time to do things well from the start, we have to make up for it later. So yes, it can cause distress (participant 9).

My organization wanted me to apply in English, and I said no, and everyone in the administration wanted me to apply in English, and I always said no. Some people said: “Listen, I give you the choice”, then some people said: “Listen, I agree with you, but if you’re not [submitting] in English, you won’t be funded”. Then the fact that I am young too, because very often they will look at the CV, they will not look at the project: “ah, his CV is not impressive, we will not finance him”. This is complete nonsense. The person is capable of doing the project, the project is fabulous: we fund the project. So, that happened, organizational barriers: that happened a lot. I was not eligible for Quebec research funds (…). I had big organizational barriers unfortunately (participant 14).

At the time of my promotion, some colleagues were not happy with the type of research I was conducting. I learned – you learn this over time when you become friends with people after you enter the university – that someone was against me. He had another candidate in mind, and he was angry about the selection. I was under pressure for the first three years until my contract was renewed. I almost quit at one point, but another colleague told me, “No, stay, nothing will happen”. Nothing happened, but these issues kept me awake at night (participant 16).

This difficult context for many researchers affects not only the conduct of their own research, but also their participation in research. We faced this problem in our study, despite the use of multiple recruitment methods, including more than 200 emails – of which 191 were individual solicitations – sent to potential participants by the two research assistants. REB members and organizations overseeing or supporting research (n = 17) were also approached to see if some of their employees would consider participating. While it was relatively easy to recruit REB members and research ethics experts, our team received a high number of non-responses to emails (n = 175) and some refusals (n = 5), especially by researchers. The reasons given by those who replied were threefold: (a) fear of being easily identified should they take part in the research, (b) being overloaded and lacking time, and (c) the intrusive aspect of certain questions (i.e., “Have you experienced a burnout episode? If so, have you been followed up medically or psychologically?”). In light of these difficulties and concerns, some questions in the socio-demographic questionnaire were removed or modified. Talking about burnout in research remains a taboo for many researchers, which paradoxically can only contribute to the unresolved problem of unhealthy research environments.

Returning to the Research Question and Objective

The question that prompted this research was: What are the ethical issues in research? The purpose of the study was to describe these issues from the perspective of researchers (from different disciplines), research ethics board (REB) members, and research ethics experts. The previous section provided a detailed portrait of the ethical issues experienced by different research stakeholders: these issues are numerous, diverse and were recounted by a range of stakeholders.

The results of the study are generally consistent with the literature. For example, as in our study, the literature discusses the lack of research integrity on the part of some researchers (Al-Hidabi et al., 2018; Swazey et al., 1993), the numerous conflicts of interest experienced in research (Williams-Jones et al., 2013), the issues of recruiting and obtaining the free and informed consent of research participants (Provencher et al., 2014; Keogh & Daly, 2009), the sometimes difficult relations between researchers and REBs (Drolet & Girard, 2020), the epistemological issues experienced in research (Drolet & Ruest, accepted; Sieber, 2004), as well as the harmful academic context in which researchers evolve, insofar as this is linked to a culture of performance and an overload of work in a context of accountability (Berg & Seeber, 2016; FQPPU, 2019) that is conducive to ethical distress and even burnout.

If the results of the study are generally in line with those of previous publications on the subject, our findings also bring new elements to the discussion while complementing those already documented. In particular, our results highlight the role of systemic injustices – be they social, distributive or epistemic – within the environments in which research is carried out, at least in Canada. To summarize, the results of our study point to the fact that the relationships between researchers and research participants are likely still to raise worrying ethical issues, despite widely accepted research ethics norms and institutionalized review processes. Further, the context in which research is carried out is not only conducive to breaches of ethical norms and instances of misbehaviour or misconduct, but also likely to be significantly detrimental to the health and well-being of researchers, as well as research assistants. Another element that our research also highlighted is the instrumentalization and even exploitation of students and research assistants, which is another important and worrying social injustice given the inevitable power imbalances between students and researchers.

Moreover, in a context in which ethical issues are often discussed from a micro perspective, our study helps shed light on both the micro- and macro-level ethical dimensions of research (Bronfenbrenner, 1979 ; Glaser 1994 ). However, given that ethical issues in research are not only diverse, but also and above all complex, a broader perspective that encompasses the interplay between the micro and macro dimensions can enable a better understanding of these issues and thereby support the identification of the multiple factors that may be at their origin. Triangulating the perspectives of researchers with those of REB members and research ethics experts enabled us to bring these elements to light, and thus to step back from and critique the way that research is currently conducted. To this end, attention to socio-political elements such as the performance culture in academia or how research funds are distributed, and according to what explicit and implicit criteria, can contribute to identifying the sources of the ethical issues described above.

A Contemporary Culture Characterized by Social Acceleration

The German sociologist and philosopher Rosa (2010) argues that late modernity – that is, the period between the 1980s and today – is characterized by a phenomenon of social acceleration that causes various forms of alienation in our relationship to time, space, actions, things, others and ourselves. Rosa distinguishes three types of acceleration: technical acceleration, the acceleration of social changes, and the acceleration of the rhythm of life. According to Rosa, social acceleration is the main problem of late modernity, in that the invisible social norm of doing more and faster to supposedly save time operates unchallenged at all levels of individual and collective life, as well as organizational and social life. Although we all, researchers and non-researchers alike, perceive this unspoken pressure to be ever more productive, the process of social acceleration as a new invisible social norm is our blind spot, a kind of tyrant over which we have little control. This conceptualization of contemporary culture can help us to understand the context in which research is conducted (like other professional practices). To this end, Berg and Seeber (2016) invite faculty researchers to slow down in order to better reflect and, in the process, take care of their health and their relationships with their colleagues and students. Many women professors encourage their fellow researchers, especially young women researchers, to learn to “say No” in order to protect their mental and physical health and to remain in their academic careers (Allaire & Deschenaux, 2022). These authors also remind us of the relevance of Kahneman’s (2012) work, which demonstrates that it takes time to think analytically, thoroughly, and logically. Conversely, thinking quickly exposes humans to cognitive and implicit biases that then lead to errors in thinking (e.g., in the analysis of one’s own research data or in the evaluation of grant applications or student curriculum vitae).
The phenomenon of social acceleration, which pushes researchers to think ever faster, is likely to lead to bad, even unethical, science that can potentially harm humankind. In sum, Rosa’s invitation to contemporary critical theorists to take seriously the problem of social acceleration is particularly insightful for understanding the ethical issues of research. It provides a lens through which to view the toxic context in which research is conducted today, a view that was shared by the participants in our study.

As Clark and Sousa (2022) note, it is important that criteria other than the volume of researchers’ contributions be valued in research, notably quality. Ultimately, it is the value of the knowledge produced and its influence on the concrete lives of humans and other living beings that matters, not the quantity of publications. An interesting articulation of this view in research governance is seen in a change in practice by Australia’s national health research funder: it now restricts researchers to listing on their curriculum vitae only their top ten publications from the past ten years (rather than all of their publications), in order to evaluate the quality of contributions rather than their quantity. To create environments conducive to the development of quality research, it is important to challenge the phenomenon of social acceleration, which insidiously imposes a quantitative normativity that is both alienating and detrimental to the quality and ethical conduct of research. Based on our experience, we observe that the social norm of acceleration actively disfavours the conduct of empirical research on ethics in research; researchers are so busy that it is almost impossible for them to find time to participate in such studies. Further, operating in highly competitive environments while trying to respect the values and ethical principles of research creates ethical paradoxes for members of the research community. According to Malherbe (1999), an ethical paradox is a situation in which an individual is confronted with contradictory injunctions (e.g., do more, faster, and better). Eventually, ethical paradoxes lead individuals to situations of distress and burnout, or even to ethical failures (i.e., misbehaviour or misconduct) in the face of the impossibility of responding to contradictory injunctions.

Strengths and Limitations of the Study

The triangulation of the perceptions and experiences of different actors involved in research is a strength of our study. While there are many studies on the experiences of researchers, members of REBs and experts in research ethics are rarely given the space to discuss their views on ethical issues. Giving each of these stakeholders a voice and comparing their different points of view helped shed a different and complementary light on the ethical issues that occur in research. That said, it would have been helpful to also give more space to issues experienced by students or research assistants, as the relationships between researchers and research assistants are at times very worrying, as noted by one participant, and much work still needs to be done to eliminate the exploitative situations that seem to prevail in certain research settings. In addition, no Indigenous or gender-diverse researchers participated in the study. Given the ethical issues and systemic injustices that many people from these groups face in Canada (Drolet & Goulet, 2018; Nicole & Drolet, in press), research that gives voice to these researchers would be relevant and contribute to knowledge development, and hopefully also to change in research culture.

Further, although most of the ethical issues discussed in this article may be transferable to the realities experienced by researchers in other countries, the epistemic injustice reported by Francophone researchers who persist in doing research in French in Canada – which is an officially bilingual country but in practice is predominantly English – is likely specific to the Canadian reality. In addition, and as mentioned above, recruitment proved exceedingly difficult, particularly amongst researchers. Despite this difficulty, we obtained data saturation for all but two themes – i.e., exploitation of students and ethical issues of research that uses animals. It follows that further empirical research is needed to improve our understanding of these specific issues, as they may diverge to some extent from those documented here and will likely vary across countries and academic research contexts.

Conclusions

This study, which gave voice to researchers, REB members, and ethics experts, reveals that the ethical issues in research are related to several problematic elements, such as power imbalances and authority relations. Researchers and research assistants are subject to external pressures that give rise to integrity issues, among other ethical issues. Moreover, the current context of social acceleration influences the definition of the performance indicators valued in academic institutions and has led their members to face several ethical issues, including social, distributive, and epistemic injustices, at different steps of the research process. In this study, ten categories of ethical issues were identified, described, and illustrated: (1) research integrity, (2) conflicts of interest, (3) respect for research participants, (4) lack of supervision and power imbalances, (5) individualism and performance, (6) inadequate ethical guidance, (7) social injustices, (8) distributive injustices, (9) epistemic injustices, and (10) ethical distress. The triangulation of the perspectives of the different actors (i.e., researchers from different disciplines, REB members, research ethics experts, and one research assistant) involved in the research process made it possible to lift the veil on some of these ethical issues. Further, it enabled the identification of additional ethical issues, especially the systemic injustices experienced in research. To our knowledge, this is the first time that these injustices (social, distributive, and epistemic) have been clearly identified.

Finally, this study brought to the fore several problematic elements that must be addressed if the research community is to develop and implement the solutions needed to resolve the diverse and transversal ethical issues that arise in research institutions. A good starting point is rejecting the corollary norms of “publish or perish” and “do more, faster, and better” and replacing them with “publish quality instead of quantity”, which necessarily entails “do less, slower, and better”. It is also important to pay closer attention to the systemic injustices within which researchers work, because these can significantly harm the academic careers of many researchers (including women researchers, early career researchers, and those belonging to racialized groups) as well as the health, well-being, and respect of students and research participants.


Acknowledgements

The team warmly thanks the participants who took part in the research and who made this study possible. Marie-Josée Drolet thanks the five research assistants who participated in the data collection and analysis: Julie-Claude Leblanc, Élie Beauchemin, Pénéloppe Bernier, Louis-Pierre Côté, and Eugénie Rose-Derouin, all students at the Université du Québec à Trois-Rivières (UQTR), two of whom were active in the writing of this article. MJ Drolet and Bryn Williams-Jones also acknowledge the financial contribution of the Social Sciences and Humanities Research Council of Canada (SSHRC), which supported this research through a grant. We would also like to thank the reviewers of this article who helped us improve it, especially by clarifying and refining our ideas.

Author information

Authors and affiliations

Department of Occupational Therapy (OT), Université du Québec à Trois-Rivières (UQTR), Trois-Rivières (Québec), Canada

Marie-Josée Drolet

Bachelor OT program, Université du Québec à Trois-Rivières (UQTR), Trois-Rivières (Québec), Canada

Eugénie Rose-Derouin, Julie-Claude Leblanc & Mélanie Ruest

Department of Social and Preventive Medicine, School of Public Health, Université de Montréal, Montréal (Québec), Canada

Bryn Williams-Jones


Corresponding author

Correspondence to Marie-Josée Drolet .

Ethics declarations

Competing interests and funding

As noted in the Acknowledgements, this research was supported financially by the Social Sciences and Humanities Research Council of Canada (SSHRC).

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Drolet, MJ., Rose-Derouin, E., Leblanc, JC. et al. Ethical Issues in Research: Perceptions of Researchers, Research Ethics Board Members and Research Ethics Experts. J Acad Ethics 21 , 269–292 (2023). https://doi.org/10.1007/s10805-022-09455-3


Received : 24 March 2022

Revised : 13 July 2022

Accepted : 13 July 2022

Published : 12 August 2022

Issue Date : June 2023

DOI : https://doi.org/10.1007/s10805-022-09455-3


Keywords: Researchers; Research Ethics Board Members; Research Ethics Experts


Understanding Specimen Sharing Ethics: Informed Consent, History, and Global Challenges

James Oliver

Are you aware of the ethical complexities surrounding the sharing of biological specimens in research?

In today’s interconnected world, where specimen sharing is crucial for scientific progress, understanding the ethical landscape is essential. It’s not just about following the rules; it’s about respecting the rights of individuals and cultures while fostering trust and transparency in research.

Well, we’ll be going over:

  • What are the key ethical principles that guide specimen sharing, such as informed consent and ownership?
  • How have historical events shaped the current ethical standards in specimen sharing?
  • What are the challenges, including commercialization and cultural sensitivity, that researchers must navigate to ensure ethical practices?

Let’s dive in.

Specimen sharing ethics demand transparency, respect for participants, and adherence to legal standards. Key principles include informed consent, recognizing ownership rights, and cultural sensitivity. Historical cases like the HeLa cells highlight ethical challenges, while modern regulations and guidelines ensure responsible practices. Prioritizing these ethical considerations fosters trust and protects the integrity of scientific research.

Overview Of Specimen Sharing Ethics

Specimen sharing ethics encompass several critical considerations when collecting, storing, and utilizing biological samples for research. You must prioritize informed consent from participants. This means ensuring they understand the purpose of the research, associated risks and benefits, and their right to withdraw consent at any time.

Another important aspect is ownership and control . Many participants feel a sense of ownership over their biological samples. It’s vital to clarify that samples are donated for research purposes, fostering transparency about access and use.

Additionally, consider privacy concerns related to specimen sharing. Safeguarding personal data linked to samples protects participant identities and maintains trust in the research process.

Lastly, address equitable access to specimens across diverse populations. Ensuring that various communities benefit from research findings promotes inclusivity and fairness in scientific advancement.

By focusing on these ethical dimensions—consent, ownership, privacy, and equity—you contribute significantly to responsible specimen sharing practices that respect individual rights while advancing medical knowledge.

Historical Context


Specimen sharing ethics has evolved significantly over centuries, reflecting changing societal values and scientific practices. Key events in history illustrate the foundations of ethical considerations in research involving biological samples.

Evolution Of Specimen Sharing

Early experimentation, such as Mughal emperor Akbar the Great’s study on language acquisition in mute environments during the 1500s, sparked interest in ethical dilemmas surrounding scientific inquiry. The establishment of the Royal Society in 1662 further advanced specimen sharing by introducing peer review processes, fostering a culture of collaboration while emphasizing ethical accountability.

Landmark Cases And Decisions

The Tuskegee Syphilis Study (1932–1972) represents a crucial turning point for specimen sharing ethics. This unethical study highlighted significant issues regarding informed consent and the exploitation of marginalized populations. It led to stricter regulations and guidelines governing human subject research that keep participant rights and welfare at the forefront of scientific advancement.

Ethical Principles In Specimen Sharing


Specimen sharing involves several ethical principles that ensure the rights and welfare of participants are protected. Key considerations include informed consent, ownership, and custodianship.

Informed Consent

Informed consent is essential in specimen sharing. You must provide potential research subjects with comprehensive information about the study’s purpose, methods, risks, and benefits. This process ensures that participants understand what they’re agreeing to when contributing their biospecimens. Voluntariness is crucial; participants should make decisions free from coercion or undue influence.

Ownership And Custodianship

Ownership and custodianship address who holds legal rights over biospecimens after collection. Clear agreements define the responsibilities of researchers regarding sample use and storage. You should recognize that participants retain certain rights over their specimens, emphasizing ethical stewardship in managing these valuable resources for research purposes.

Legal Considerations


Understanding legal considerations in specimen sharing ethics ensures compliance with regulations that protect participants and their biological samples.

National Regulations

Federal law mandates institutions to have a Federalwide Assurance (FWA) for human subjects research. This assurance guarantees adherence to ethical standards outlined in the Common Rule. The Belmont Report further influences national regulations by establishing principles such as respect for persons, beneficence, and justice, which guide ethical conduct in research involving specimens.

International Guidelines

International guidelines supplement national laws by promoting ethical practices across borders. Organizations like the World Health Organization emphasize informed consent and equitable distribution of benefits from shared specimens. Adhering to these guidelines fosters trust and accountability among researchers while protecting the rights of individuals involved in global specimen sharing initiatives.

Challenges And Controversies


Specimen sharing ethics face significant challenges and controversies, particularly in commercialization and cultural sensitivity. Understanding these issues is crucial for maintaining ethical standards in research.

Commercialization Of Specimens

The HeLa cells controversy exemplifies the ethical dilemmas surrounding the commercialization of human biospecimens. Derived from Henrietta Lacks, these cells were utilized widely in research without her family’s consent. This case underscores the importance of informed consent when commercializing biological materials to prevent exploitation and ensure respect for individuals’ rights.

Cultural Sensitivity

Cultural sensitivity plays a vital role in specimen sharing ethics. Varied cultural perspectives on virtues and ethical principles can lead to disagreements over practices in specimen sharing. Researchers must appreciate these differences to foster trust and cooperation among diverse communities, ensuring that all voices are considered in decision-making processes related to biological samples.

Navigating the complexities of specimen sharing ethics requires a commitment to transparency and respect for individual rights. You must prioritize informed consent and ensure that participants are fully aware of how their biological samples will be used.

Cultural sensitivity is paramount in fostering trust, as diverse perspectives can shape ethical considerations. By adhering to established guidelines and legal frameworks, you contribute to a more equitable research environment.

Embracing these principles not only protects participants but also enhances the integrity of scientific inquiry. Your awareness and advocacy for ethical practices can lead to more responsible and respectful specimen sharing in future research endeavors.





Researching the promise, perils and ethical dimensions of technology’s relationship with mental health

From developing peer support apps to investigating social media users’ perceptions of mental health and offering ethical leadership about privacy concerns, faculty and students in the Klingler College of Arts and Sciences are finding insights at this intersection.

  • By Claire Curry
  • August 22, 2024
  • 6 min. read


With nearly one in five U.S. adults living with mental illness, communities across the nation are struggling to meet the care and treatment needs of their residents, especially the most vulnerable. Faculty members and students in the Klingler College of Arts and Sciences are exploring the role technology may play in this area. Can it help improve diagnoses, access to care or peer support, and even flag individuals who are experiencing serious mental health crises? Could technology and social media promote openness and reduce the stigma surrounding mental health, or might they trivialize or glamorize serious conditions? Across the college, questions like these are driving vital discussions and research leadership on the ethical dimensions involved when mental health meets technology. 

Peer-to-peer mental health support — in an app

One team of researchers is focusing on U.S. military veterans who are known to be at higher risk for mental health disorders. According to the National Institute of Mental Health, more than 1.7 million veterans receive treatment at VA mental health specialty programs, and organizations like Dryhootch, a nonprofit with coffeehouses and resource centers for military veterans in Milwaukee and Madison, are aiding veterans who are struggling by providing peer support. 

“Veterans don’t often willingly seek out assistance from health care professionals about anything related to mental health and PTSD,” says Dr. Praveen Madiraju, professor of computer science. “They feel more at ease and open when talking to their peers. So Dryhootch started that program where veterans can go get a coffee and have that really cool, no-pressure atmosphere where they can sit and chat with other veterans.” 

“Marquette was brought in to offer technological expertise — to help bridge the gap between trauma and peer mentorship.” Dr. Praveen Madiraju

Leveraging that success, Marquette faculty, working in partnership with Dryhootch, have co-created a telehealth app called BattlePeer that brings the support of peers — akin to the “battle buddies” who had their backs in wartime — directly to veterans on their mobile phones.  

“Marquette was brought in to offer technological expertise — to help bridge the gap between trauma and peer mentorship,” Madiraju says. Marquette faculty, including Madiraju, and Dr. Iqbal Ahamed, Wehr Professor of Computer Science, collaborated with Dryhootch founder Bob Curry, a combat veteran, and Dr. Zeno Franco, associate professor at the Medical College of Wisconsin and psychologist at the U.S. Department of Veteran Affairs, to develop the app that matches veteran mentors with mentees, sends weekly check-ins to assess mental health and offers private and group chat features. 

The team continues to refine the app and plans to expand its use to support first responders and people with cancer and other chronic diseases. “Sometimes you just need a bridge through whatever you’re going through,” Curry says. “This technology can be the bridge that gets you through to the other side.” 

So far, the app has put peer support in the hands of hundreds and has the potential to reach many more, thanks to a recent licensing agreement that now makes BattlePeer available in Apple and Google app stores. 

“This has been more than a decade in the making,” says Madiraju. “With the licensing, we have a massive scale advantage. Potentially, [BattlePeer] can reach hundreds of thousands of veterans across the nation.”  

Online storytelling communities help new mothers

Dr. Sabirat Rubya, Northwestern Mutual Data Science Institute Assistant Professor, who contributed to developing BattlePeer, is also exploring how technology can support another at-risk population: new and expecting moms. According to Rubya, one woman in seven encounters postpartum depression, and many turn to online communities for information and advice.  

“Think of it as a curated space for sharing and finding support through powerful stories.” Dr. Sabirat Rubya

Women are at times reluctant to talk about their challenges as soon-to-be and new parents, even though this type of information sharing offers valuable emotional support, says Rubya. Many women aren’t sharing their problems directly with one another because of the stigma around mental health issues related to pregnancy and new motherhood. 

Rubya and graduate student Farhat Tasnim Progga analyzed three online venues — Reddit, What to Expect and BabyCenter — and concluded that online storytelling is an effective way to foster support for perinatal mental health. Their findings inspired them to develop “Mom Stories,” a web-based application where women can find stories on a range of topics of interest to new and expecting moms, from depression and baby blues to breastfeeding and newborn health. “Think of it as a curated space for sharing and finding support through powerful stories,” Rubya says about the application, which is set to launch in late 2024. 

Risks and ethical questions for data use 

“Is there not an ethical obligation to use every tool, every piece of data at their disposal, to attempt to save lives and help improve public health outcomes?” Dr. Michael Zimmer

While technology is making inroads in mental health research and treatment, health care professionals and scholars, including those at Marquette, are identifying ethical issues raised by these applications of technology. Collecting and accessing large data sets to advance research may compete with the goals of protecting patient privacy, for example. And deploying AI bots to provide real-time medical advice risks overlooking how biased and insufficiently trained the bots may be. 

Dr. Michael Zimmer, director of the Center for Data, Ethics, and Society at Marquette and professor and vice chair of the Department of Computer Science, shared his thoughts on these ethical tensions in a June webinar hosted by the Center for Suicide Research and Prevention, joined by experts from Harvard Medical School and Northwestern University. 

In his presentation, Zimmer pointed to work being done at the Smoller Laboratory at Massachusetts General Hospital, where researchers are using large data sets to help develop suicide risk prediction models. The researchers built a sound predictive model, he says, and are now exploring ways to enhance it by tapping public data sources. While the ultimate goal is to improve health outcomes and save lives, using public data for this purpose raises ethical questions.  

Public data and prediction models — a step too far?

“Imagine you had access to a hundred thousand people’s health records,” says Zimmer, who served as a consultant on the project. “You knew which 10,000 had made a suicide attempt at some point and you started looking at their records to see what was unique compared to everyone else.” Such records might reveal, for example, whether an individual had an interaction with law enforcement or a bankruptcy — information that Zimmer says could lead to “unintended consequences.”

Using such data in the prediction model could lead to the presumption that those who had an interaction with law enforcement or experienced a bankruptcy are at risk of suicide — even though many would not be. And reaching out to those “at risk” individuals the model identifies based on such criteria could possibly help avert tragedies, while also potentially representing yet another breach of privacy.
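The false-positive worry here is at bottom a base-rate problem, and a back-of-the-envelope calculation makes it concrete. The sketch below is purely illustrative: the rates for how often a flag such as a bankruptcy appears in each group are invented, not drawn from the Smoller Laboratory's work.

```python
# Illustrative only: why flagging everyone who shares a trait with past
# attempters (e.g., a bankruptcy) sweeps in many people who are not at risk.
# All rates below are invented for this sketch.

def ppv(prevalence: float, sensitivity: float, specificity: float) -> float:
    """Positive predictive value of a flag, via Bayes' rule."""
    true_pos = prevalence * sensitivity               # flagged and truly at risk
    false_pos = (1 - prevalence) * (1 - specificity)  # flagged but not at risk
    return true_pos / (true_pos + false_pos)

# As in the 10,000-of-100,000 example above, assume 10% prevalence, and
# suppose the flag appears in 15% of at-risk records and 5% of the rest.
print(round(ppv(prevalence=0.10, sensitivity=0.15, specificity=0.95), 2))  # 0.25
```

Even with these generous assumed rates, only one flagged person in four would actually belong to the at-risk group, which is why acting on such a flag risks both mislabeling people and compounding the privacy intrusion.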

“In the health setting, there’s a different kind of ethical calculus at play because these are folks who are trying to find ways to do good things with technology,” he explains. “Is there not an ethical obligation to use every tool, every piece of data at their disposal, to attempt to save lives and help improve public health outcomes?” 

At the same time, it’s critical to look beyond how valuable the data is and recognize that the facts and figures represent real people and, often, vulnerable populations. “Things are moving quickly, but in the mental health space, everyone recognizes there are risks of moving too fast,” Zimmer says. “If we want to do good, we have to get this right.” 

Student researchers investigate how social media users perceive mental health

In addition to faculty, Marquette students and alumni are conducting research on technology and mental health. Psychology major Iza Guzek, Arts ’23, was part of a team under the guidance of Dr. Stephen Saunders, professor of psychology, that investigated whether social media has a larger role in helping students open up about their mental health — or leads to students trivializing, or even glorifying, mental illness. The students surveyed more than a hundred 18- to 28-year-olds to learn how users of TikTok and Instagram perceive people posting on these venues about mental illness. The majority of respondents described them using terms such as “admirable,” “cool,” “brave” and “strong.” 

The increasing use of social media to discuss mental health issues has helped to reduce stigma, Saunders and Guzek agree, but the pendulum may have swung a bit too far. “From our findings, we were glad to see that people disagree with the notion that the individuals [posting about mental health issues] come off as ‘scary’ or ‘dangerous,’” Guzek explains. “This shows that they do not view them in the typical ‘stigmatizing way’ so to speak. But what was surprising, yet supporting our hypothesis, was how a lot of participants strongly agreed to mental illness being portrayed as something ‘cool’ or ‘admirable.’”  Early results suggest that this is particularly true of the social media platform TikTok.  

The team, which included Shea O’Connor, Arts ’24, and Christina Schmidt, Arts ’23, presented its research poster at the Wisconsin Psychological Association and the Marquette Undergraduate Research Symposium and has continued work on the project and corresponding paper. The goal is to present it at the Association for Psychological Science’s 2024 Global Psychological Science Summit in October. Guzek is committed to carrying the research forward and building upon it in her future career. 

“At Marquette, we are taught cura personalis, which means care for the whole person,” Guzek says. “We are advised to Be The Difference. Research such as this is one of the ways I can care for others and make a difference. Even after graduating, I feel it is my mission.” 

The Guzek-O’Connor-Schmidt team was one of four that conducted independent research under Saunders’ mentorship last year.  

Another group studied the associations of prosocial and antisocial behavior with mental health, both online and offline. Conducted by Liam Pyne, ’24, and Pat Swanson, a rising junior psychology and economics major, the study examined the social media behaviors and reactions of 275 survey participants and concluded that prosocial behavior leads to improvements in mental health and self-esteem, especially among people who know each other, while antisocial behavior leads to negative feelings — especially among strangers online.  

Saunders says that in addition to learning about the scientific process, students learn about themselves and what they may want to pursue as a career. “Getting students involved also shows them, hopefully, that anything about which they are curious can be studied.” 

Ethical Guidelines, Emerging Regulations in Octopus Research

Understanding octopus research guidelines is essential for ethical healthcare progress, ensuring that breakthroughs align with moral standards and prioritize animal welfare.

  • Alivia Kaylor, MSc

With their unique biological features, octopuses have become valuable subjects in medical research. Their extraordinary abilities, such as camouflage, regenerative capabilities, and highly developed nervous systems, make them captivating candidates for studies that could lead to innovative healthcare technology and medical research breakthroughs.

However, using octopuses as an animal model in research raises ethical concerns and regulatory challenges that must be addressed.

Ethical Oversight and Regulatory Framework

In the United States, where the exact number of octopuses used in biomedical research is not readily available, ethical and regulatory oversight is crucial. Research institutions must adhere to guidelines set forth by agencies such as the National Institutes of Health (NIH) and the US Department of Agriculture (USDA). Institutional Animal Care and Use Committees (IACUCs) are also pivotal in reviewing and approving animal research protocols, ensuring they meet ethical standards and comply with applicable regulations.

Octopuses in the Laboratory

One area of research involving octopuses focuses on their unique ability to match their skin's color, pattern, and texture to their surroundings. This research, led by Roger Hanlon, PhD, a scientist at the Marine Biological Laboratory in Woods Hole, Massachusetts, revealed that the skins of cuttlefish and squid are full of light-detecting molecules known as opsins. These opsins are the same as those found in the animals’ retinas, suggesting the animals can perceive light through their skin. Research later confirmed that opsins are present in octopus skin.

Although these studies offer insights into biomimicry for military applications and medical devices, octopuses may not be the ideal animal model due to their high intelligence. Unlike rodents, octopuses are not as cooperative in controlled settings, leading to difficulties in certain types of studies.

“The octopus surpasses the scope of [Hanlon’s] study. When placed in a tank, its immediate instinct is to thoroughly explore its surroundings, followed by attempts to break free,” explained Amy Hancock-Ronemus, DVM, a former veterinarian at the University of Chicago’s Marine Biological Laboratory. “Their focus isn't necessarily fixated on conforming to the tank's bottom, as their intelligence likely informs them that such conformity is unnecessary.”

According to Hancock-Ronemus, the Marine Biological Laboratory’s policy is that all research animals are to be treated humanely. “Any lab animal veterinarian will tell you we need to give animals the benefit of the doubt,” she said to the American Veterinary Medical Association (AVMA). “We know any animal can feel stress and distress, and there’s really no good argument for not giving them the most humane care we know of.”

The Cost of Research Models

Octopuses, being more complex organisms, present challenges in terms of cost and maintenance compared to traditional laboratory animals like rodents. Rodents like rats have been widely used in research due to their availability, ease of handling, and lower costs.

Meanwhile, octopuses require specific environmental conditions and are generally more expensive to acquire and maintain. Researchers often choose models based on the relevance to their study, ethical considerations, and budget constraints.

Challenges in Aquarium Settings

Breeding octopuses in captivity is notoriously tricky, leading to most research subjects and aquarium displays being wild-caught. The challenges include creating modified tanks without internal vents or openings and providing secure lids to prevent escapes. Health issues, such as parasitic and bacterial infections, are common in captive octopuses, and addressing these concerns adds another layer of complexity to their care.

Octopuses, known for their high intelligence, require frequent mental stimulation in captivity. According to the AVMA, aquarium staff at Shedd Aquarium engage octopuses in training sessions to monitor their behavior and health. The staff noted that each octopus exhibits unique preferences and personalities, showcasing the need for specialized care and attention in research settings.

Each octopus has its own food preferences and favorite items or toys incorporated into its enrichment activities, according to Eve Barrs, a former aquarist at the Shedd Aquarium in Chicago. One octopus liked ice cube trays, while another was inseparable from a whiffle ball.

"He always kept it near, nestled in his tentacle," she reminisced. "He would bring it along during feedings and carefully carry it back to his den afterward."

Biomedical Breakthroughs from Octopus Research

Despite these mounting challenges, octopuses have contributed significantly to various biomedical studies.

Camouflage and Adaptive Coloration

Octopuses are masters of camouflage and can change the color and texture of their skin to blend into their surroundings. Researchers have studied the mechanisms behind this ability to develop a deception technology platform that is used in various fields, such as the military, medicine, robotics, and sustainable energy.

Regenerative Abilities

Octopuses have impressive regenerative abilities and can regrow arms that have been injured or amputated. Scientists are interested in understanding this regenerative capability's molecular and cellular processes, as it may provide insights into human regenerative medicine.

Neuroscience and Intelligence

Octopuses have highly developed nervous systems and complex behaviors. Studying their brains and nervous systems can provide insights into neurobiology and potentially inspire advancements in artificial intelligence.

Sucker Function

The suckers on an octopus's arms have a remarkable ability to manipulate objects with great dexterity. Researchers have examined the structure and function of these suckers to develop soft robotics and prosthetic limbs with improved gripping capabilities.

Venom Research

Octopus species possess venom that they use for predation or defense. Scientists have investigated the chemical composition of octopus venom for potential medical applications. For example, a team of Spanish and Australian researchers is studying the tumor-fighting properties of a group of synthetically produced venom compounds from various marine animals.

"The octopus peptide stops the proliferation of BRAF-mutated melanoma," said Maria Ikonomopoulou, PhD, of the Institute for Molecular Bioscience at the University of Queensland, to Newsweek. "In addition, it is safe to be used at high doses; it is not toxic. Therefore, in combination with other FDA-approved melanoma drugs/management, treatments could potentially achieve better and safer patient outcomes."

While these studies have provided valuable insights and potential applications, research in biomimicry, where nature's designs and processes inspire human innovation, is still evolving, and many breakthroughs are in the earliest stages of development.

Recent Developments in Regulatory Oversight

In the US, cephalopods have been excluded from certain regulations covering laboratory animals due to their invertebrate status. However, recent initiatives by the NIH propose guidelines for the care and use of cephalopods in research. This marks a significant step toward ensuring ethical treatment and oversight, requiring researchers to obtain approval from IACUCs. The proposed guidance addresses factors such as justification for using cephalopods, sedation and anesthesia, and the impact of experimental procedures on the animals' well-being.

The Physicians Committee, alongside researchers, advocates, and Congress, has played a pivotal role in urging regulatory bodies to establish better protections for cephalopods used in research. The efforts to amend the definition of "animal" in the Public Health Service Policy on Humane Care and Use of Laboratory Animals (PHS Policy) to include cephalopods have gained momentum. While recently proposed guidance by the NIH is a positive step, ongoing advocacy work emphasizes the need for continuous efforts to ensure the humane treatment of cephalopods and the exploration of non-animal research methods.

As the healthcare technology industry evolves, striking a balance between scientific advancements and ethical treatment of research subjects, including octopuses, remains crucial for the future of medical breakthroughs.

Editor's Note: This article has been updated to note that Eve Barrs is a former employee of Chicago's Shedd Aquarium and that Amy Hancock-Ronemus, DVM, is a former veterinarian at the University of Chicago’s Marine Biological Laboratory.

Navigating Complex, Ethical Problems in Professional Life: a Guide to Teaching SMART Strategies for Decision-Making

Tristan McIntosh

1 Bioethics Research Center, Division of General Medical Sciences, Department of Medicine, Washington University School of Medicine in St. Louis, St. Louis, MO, USA

Alison L. Antes

James M. DuBois

This article demonstrates how instructors of professionalism and ethics training programs can integrate a professional decision-making tool in training curricula. This tool can help trainees understand how to apply professional decision-making strategies to address the threats posed by a variety of psychological and environmental factors when they are faced with complex professional and ethical situations. We begin by highlighting key decision-making frameworks and discussing factors that may undermine the use of professional decision-making strategies. Then, drawing upon findings from past research, we present the “SMART” professional decision-making framework: seeking help, managing emotions, anticipating consequences, recognizing rules and context, and testing assumptions and motives. Next, we present a vignette that poses a complex ethical and professional challenge and illustrate how each professional decision-making strategy could or should be used by characters in the case. To conclude, we review a series of educational practices and pedagogical tools intended to help trainers facilitate trainee learning, retention, and application of “SMART” decision-making strategies.

Our aim is to illustrate how to effectively educate professionals on ways to apply decision-making strategies when they are faced with complex professional and ethical issues. Appropriate and effective application of these strategies is a trainable skill that can be developed in individuals from a range of backgrounds, disciplines, and career stages. We first explore the complexities of professional decision-making in a research context and highlight an innovative compensatory strategy framework. Then, we present a case example of proper and improper application of these strategies when navigating complex professional and ethical situations. We then showcase pedagogical techniques intended to integrate these compensatory strategies into training activities and facilitate retention and application of these strategies. The term “trainees” is used throughout and refers to any individual, regardless of career stage, who learns from and takes part in training on professional decision-making strategies. In sum, the intent of the present effort is to describe how to provide trainees with strategy-based knowledge and skills needed for professional decision-making. These strategies ultimately serve to facilitate better ethical decision-making and professionalism.

Professional Decision-Making Frameworks

Professionals, including those who conduct research, regularly face complex circumstances that require professional decision-making skills. Although professionalism has been defined in multiple ways, for the purposes of the present effort, we define professionalism as integrating ethics and other relevant factors (e.g., competence, collegiality, institutional and departmental culture) needed to ensure public trust and achieve the goals of the profession (e.g., healing in medicine, generating new knowledge in research) ( Stern and Papadakis 2006 ; Swick 2000 ; van Mook et al. 2009 ). Unfortunately, the nature of situations professionals encounter and unconscious self-serving biases all professionals have may undermine the effectiveness of professional decision-making. Therefore, professional decision-making necessitates careful navigation and includes weighing different options to address the issue at hand, forecasting likely implications of actions, and gathering more information from multiple reliable sources ( Antes et al. 2010 ; Stenmark et al. 2011 ).

Two different frameworks of professional decision-making can be useful when professionals are confronted with these challenging circumstances: 1) a rational decision-making framework ( Goodwin et al. 1998 ; Oliveira 2007 ), and 2) a psychological framework ( DuBois et al. 2015a ; Mumford et al. 2008 ). Rational decision-making, also referred to as normative decision-making, is characterized by adherence to a set of established principles that guide decision-making, often in a group setting ( Hoch et al. 2001 ; Oliveira 2007 ). Specifically, rational decision-making involves the identification of key components of a situation and justifying decisions related to this situation when different viewpoints are in conflict with one another ( DuBois 2008b , 2013 ). Moreover, those who engage in rational decision-making analyze a number of possible alternative outcomes prior to making a definitive choice and make their decision based on the most likely and best possible outcome ( Hoch et al., 2001 ). This type of decision-making lends itself well to circumstances when professionals are unsure how to address an ethical dilemma, when a group is trying to establish best policies, or when there is disagreement among stakeholders on issues such as relevant facts and norms ( DuBois 2013 ). As it relates to ethical dilemmas, rational decision-making facilitates identification of key ethical concerns that society acknowledges as integral to rational discussions about ethical issues ( DuBois 2013 ).

The psychological framework related to professional decision-making is characterized by a confluence of situational complexities and self-serving biases that influence the way people frame and approach problems ( Bazerman and Moore 2013 ; Mumford et al. 2008 ). Oftentimes, a “correct” or “best” approach to these problems may not be apparent because of factors such as conflicting interests or needing to address concerns of multiple stakeholders ( Dana and Loewenstein 2003 ; Mumford et al. 2007 ; Weick et al. 2005 ). Being able to make sense of and effectively respond to these problems hinges on one’s ability to manage biases and attend to and utilize relevant information appropriately. This approach to professional decision-making lends itself well to situations in which professionals, when faced with complex ethical dilemmas, intend to take the best course of action but have difficulty doing so due to personal and environmental constraints ( Antes 2013 ). Such constraints may include complexity of social dynamics, heightened emotions, conscious and unconscious biases, uncertainty, and ambiguity.

The present effort will highlight strategies intended to facilitate the psychological framework of professional decision-making, as opposed to rational decision-making, because these strategies enable bias management and quality information integration, application, and synthesis. Moreover, these strategies are beneficial in situations where environmental constraints act to undermine an individual’s intent to take the best possible course of action. These strategies help professionals deal with moral distress, situational limitations (e.g., political tensions, increases in regulations, cultural differences), and internal limitations (e.g., ignoring key elements of a situation, self-centered thinking, unwarranted certainty) ( DuBois et al. 2016 , 2015b ).

In what follows, we will demonstrate the utility of a psychological decision-making framework within the context of the research profession, the SMART professional decision-making framework: Seeking help, Managing emotions, Anticipating consequences, Recognizing rules and context, and Testing assumptions and motives. Research provides a useful context for illustrating SMART strategy training because research frequently involves complexity, ambiguity, assumptions, stress, ethical considerations, and conflicts of interest. Further, ethics training is mandated for all federally-funded research trainees and many key personnel on grants involving human or animal subjects. We believe the SMART professional decision-making framework can add value to ethics training programs in research and other professions.

Constraints to Professional Decision-Making

Several factors can interfere with optimal professional decision-making. We discuss four factors that can be effectively addressed through the use of SMART decision-making strategies: complexity, ambiguity, biases, and unusually high or negative emotions.

Complexity

Professionals must carefully address and navigate complex and dynamic issues throughout their careers. For researchers, complexity often characterizes data management, mentoring relationships, protection of research participants, institutional hierarchies, and conflicts of interest ( Anderson et al. 2007 ; DuBois 2008a ; Jahnke and Asher 2014 ). These issues oftentimes involve multiple competing goals, guidelines, and stakeholder interests and are not simple to address ( Werhane 2008 ).

For example, a researcher may have competing interests between their funding agency’s research priorities and their own profession’s methodological norms and standards. These conflicting interests and complex relationships between funding agencies and researchers may undermine confidence in the quality of research being conducted if not appropriately managed ( Irwin 2009 ). Researchers are responsible for identifying and navigating conflicts of interest. Navigating conflicts of interest necessitates reconciling conflicting values, perspectives, and agendas of multiple stakeholders at the individual, institutional, governmental, and national levels. Failing to do so may result in public mistrust of research, harm to others, tarnished personal and professional relationships, or ruined careers. Thus, professional decision-making strategies can be applied when attempting to identify, prioritize, and reconcile complex stakeholder interests. The multifaceted nature of ethical and professional situations in a research context has the potential to derail professional decision-making if not handled appropriately.

Uncertainty

It is common for individuals in research fields to be exposed to new and unfamiliar environments and projects where considerable gaps in knowledge may exist. Uncertainty may arise when regulations grow in complexity over time, when a researcher moves into a new research space, or when a researcher moves to a new nation with a different culture or an unfamiliar set of rules and norms ( Antes et al. 2017 ; DuBois et al. 2016 ). Navigating social and professional life in a new culture, with a new language, and with possibly different ethical standards can be challenging and stressful.

Uncertainty may inadvertently lead to the misinterpretation of norms and other social and professional cues integral to making professional and ethical decisions ( Palazzo et al. 2012 ). This is because individuals may lack essential information needed for interpreting a given situation appropriately ( Sonenshein 2007 ), which may result in failure to think of long-term downstream consequences of their actions or failure to consider the entire range of possible courses of action. Moreover, “unknown unknowns” may result in a breakdown of quality professional decision making if help is not sought from other individuals or resources that are able to provide sound guidance on these issues.

For example, a lab manager may task a new postdoctoral fellow with collecting data from participants using a certain technique, but the postdoc may be unfamiliar with the standard procedures for doing so. Tense lab dynamics between the lab manager and other lab members may worsen this uncertainty by making it uncomfortable or difficult for the postdoc to seek help from another lab mate. Similarly, these lab dynamics may signal to the postdoc that limited or hostile communication is the norm in the lab, which may prompt the postdoc to proceed with their work in isolation. Failure to seek help due to social ambiguities may result in costly protocol violations or detrimental outcomes for both the participants and researchers involved. Without proper use of professional decision-making strategies, facing uncertainty or unfamiliar norms may lead to poor decision-making and negative consequences.

Biases

Despite even the best intentions to maintain objectivity, professionals may be subject to unconscious biases when processing information ( Hammond et al. 1998 ; Kahneman 2003 ; Palazzo et al. 2012 ). This poses a considerable challenge to professionals who aim to accurately and objectively process available information relevant to a given situation and to make a sensible, unbiased decision ( Bazerman and Moore 2013 ). Many of these judgment errors, or cognitive distortions, are automatic, making it challenging for individuals to fully understand the negative influence of biases on decision-making and information processing ( Kahneman 2003 ; Moore and Loewenstein 2004 ). Biases such as rationalization ( Davis et al. 2007 ; DuBois et al. 2015a ), tunnel vision ( Posavac et al. 2010 ), self-preservation ( Bandura et al. 1996 ; Oreg and Bayazit 2009 ), rigorous adherence to the status quo ( Samuelson and Zeckhauser 1988 ), and diffusion of responsibility ( Voelpel et al. 2008 ) may contribute to flawed decision-making on the part of professionals.

To illustrate, a researcher may cut corners during the informed consent process as they think to themselves, “nobody reads consent forms anyway” (i.e., assuming the worst) ( DuBois et al. 2015a ). In yet another example, a researcher may decide to drop outliers from a dataset without reporting it as they think to themselves, “it’s not like I fabricated any data” (i.e., euphemistic comparison). Both of these examples depict poor professionalism. These biased behaviors may occur subconsciously or be actively justified by an individual as in the cases above ( DuBois et al. 2015a ). Regardless, the characters in these examples failed to utilize professional decision-making strategies that could have helped inoculate against the effects of detrimental self-serving biases.

Emotions

While professional decision-making requires a certain degree of objective and rational thought in order to be successful, professionals are not always rational and objective in their approach to making decisions ( Kahneman et al. 2011 ; Tenbrunsel et al. 2010 ). It is easy to see how heightened emotions could undermine professional decision-making, for example, when working long hours, applying for intensely competitive grant funding, dealing with a difficult colleague, or trying to impress a world-famous and notably erudite senior faculty member. Stress, negative emotions, or intense emotions left unregulated or unacknowledged have been shown to lessen the cognitive resources needed for effective professional decision-making ( Gino et al. 2011 ; Haidt 2001 ; Mead et al. 2009 ). When cognitive resources are depleted, reasoning is impaired and individuals tend to make hasty, biased decisions ( Angie et al. 2011 ; Bazerman and Moore 2013 ; Gross 2013 ). Professional decision-making strategies can help counteract these effects.

SMART Strategies

Despite obstacles to effective professional decision-making, certain compensatory strategies exist that enable professionals to help offset these obstacles. Taking a structured approach to making these decisions can help professionals effectively apply strategies that guide ethical decision-making, bias management, and quality information processing ( Bornstein and Emler 2001 ; DuBois et al. 2018 ; Thiel et al. 2012 ). Furthermore, this systematized thought process balances the aforementioned constraints that can negatively influence professional decision-making ( DuBois et al. 2015b ).

Building on the sensemaking work of Mumford (Mumford et al. 2008) and the bias reduction work of Gibbs (Gibbs et al. 1995), DuBois and his colleagues (DuBois 2014; DuBois et al. 2015b) in the Professionalism and Integrity in Research Program (P.I. Program) developed a structured decision-making aid to help professionals remember and recall a comprehensive set of compensatory strategies. Strategy-based training has proven effective in developing cognitive skills (Clapham 1997) and has succeeded in improving professional decision-making in the P.I. Program (DuBois et al. 2018). These strategies shape professional decision-making and help professionals work through ethical dilemmas. The strategies form the acronym "SMART" and encompass five domains: Seek help, Manage emotions, Anticipate consequences, Recognize rules and context, and Test assumptions and motives. Table 1 depicts key dimensions of each strategy and reflection questions that can be used to apply each strategy. While these strategies have distinct components, they are related to one another and conceptually overlap. Each professional decision-making strategy is described in detail below.

Table 1. SMART strategies

Strategy | Dimensions | Reflection questions
Seek Help | Gather facts, options, and potential outcomes; request mediation by an objective third party; ask for and welcome feedback and correction | "Do I welcome feedback or input from others?" "Where could I seek additional unbiased, objective information or opinions?" "Have I owned up to mistakes and apologized to all involved to move forward?"
Manage your Emotions | Identify the emotions being experienced; manage those emotions; acknowledge both positive and negative emotions | "What are my emotional reactions to this situation?" "How are my emotions influencing my decision-making?" "Would taking a timeout or a deep breath help the situation?"
Anticipate Consequences | Anticipate consequences for oneself and others, short- and long-term, positive and negative; consider formal and informal responses; manage and mitigate risk | "What are the likely short- and long-term outcomes of a variety of choices?" "Who will be affected by my decisions and how?" "How can risks be minimized and benefits be maximized?"
Recognize Rules and Context | Recognize formal rules (laws, policies) and informal rules (social norms); recognize the power dynamics of those involved | "What are the causes of the problem in this situation that I can change?" "What ethical principles, laws, or regulations apply in this situation?" "Who are the decision-makers in this situation?"
Test your Assumptions and Motives | Address the possibility of faulty assumptions; examine your motives compared with others' motives; compare your assumptions and motives with those of others empathetically | "Could I be making faulty assumptions about the intentions of others?" "What are my motives?" "How will others view my choices?"

Seeking Help

This strategy is characterized by 1) gathering information such as facts, options, and potential outcomes, 2) requesting the mediation of an objective third party, and 3) asking for and welcoming feedback and correction. By deliberately processing context-relevant information and consulting with objective others, it is possible to correct for biases and challenge initial assumptions (Sonenshein 2007). This allows information that may formerly have been disregarded or misconstrued to be revealed and used effectively (Mumford et al. 2008). When attempting to apply this strategy, professionals should reflect on questions such as, "Do I welcome feedback or input from others?", "Where could I seek additional unbiased, objective information or opinions?", or "Have I owned up to mistakes and apologized to all involved to move forward?"

Managing Emotions

The strategy of managing stress and emotion is characterized by 1) identifying the emotions being experienced, 2) managing those emotions, and 3) acknowledging both positive and negative emotions such as excitement and anxiety. When attempting to apply this strategy, professionals should ask themselves questions such as, “What are my emotional reactions to this situation?”, “How are my emotions influencing my decision-making?”, “Would taking a timeout or a deep breath help the situation?”

Anticipating Consequences

The strategy of anticipating consequences is characterized by 1) anticipating consequences to both oneself and others, 2) anticipating both long-term and short-term consequences, 3) anticipating both positive and negative consequences, 4) considering formal and informal responses, and 5) managing and mitigating risk. When attempting to apply this strategy, professionals should reflect on questions such as, “What are the likely short- and long-term outcomes of a variety of choices?”, “Who will be affected by my decisions and how?” and “How can risks be minimized and benefits be maximized?”

Recognizing Rules and Context

This strategy is characterized by 1) recognizing formal rules, such as laws and policies, 2) recognizing informal rules, such as social norms, and 3) recognizing the power dynamics of individuals involved in a given situation. Professionals attempting to apply this strategy should ask themselves, “What are the causes of the problem in this situation that I can change?”, “What ethical principles, laws, or regulations apply in this situation?”, and “Who are the decision-makers in this situation?”

Testing Assumptions and Motives

This strategy is characterized by 1) addressing the possibility you might be making faulty assumptions, 2) examining your motives compared to the motives of others, and 3) comparing your assumptions and motives with those of others in an empathetic manner. When attempting to apply this strategy, professionals should reflect on questions such as, “Could I be making faulty assumptions about the intentions of others?”, “What are my motives?”, and “How will others view my choices?”
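To make the aid concrete, the five strategies and their reflection questions can be represented as a simple checklist that a training tool might step through. The sketch below is purely illustrative: the data structure and the `smart_prompts` helper are our own invention, not part of the P.I. Program materials, though the strategy names and questions are taken from the descriptions above.

```python
# Illustrative only: the SMART strategies encoded as a reflection checklist.
# Strategy names and questions are quoted from the strategy descriptions;
# the dict layout and the smart_prompts() helper are hypothetical.
SMART_STRATEGIES = {
    "Seek Help": [
        "Do I welcome feedback or input from others?",
        "Where could I seek additional unbiased, objective information or opinions?",
    ],
    "Manage your Emotions": [
        "What are my emotional reactions to this situation?",
        "How are my emotions influencing my decision-making?",
    ],
    "Anticipate Consequences": [
        "What are the likely short- and long-term outcomes of a variety of choices?",
        "Who will be affected by my decisions and how?",
    ],
    "Recognize Rules and Context": [
        "What ethical principles, laws, or regulations apply in this situation?",
        "Who are the decision-makers in this situation?",
    ],
    "Test your Assumptions and Motives": [
        "Could I be making faulty assumptions about the intentions of others?",
        "What are my motives?",
    ],
}

def smart_prompts():
    """Yield every (strategy, reflection question) pair in SMART order."""
    for strategy, questions in SMART_STRATEGIES.items():
        for question in questions:
            yield strategy, question
```

A training application could iterate over `smart_prompts()` to present one question at a time; note that the first letters of the five keys spell out the acronym.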

Not only have compensatory strategies been demonstrated to be a helpful tool for high-quality professional decision-making, but these strategies are also learnable, trainable, and applicable to a wide variety of challenges and situations ( DuBois et al. 2015b ; Kligyte et al. 2008 ). The generalizability of strategies is noteworthy because they apply across contexts (e.g., human subjects research, animal research, translational research) and challenges faced by professionals (e.g., compliance, personnel management, integrity, bias). Moreover, these compensatory strategies, when applied correctly, can facilitate more critical analysis of a problem, improve information gathering and information evaluation, and contribute to better decision-making that leads individuals to make more professional and ethical decisions ( DuBois et al. 2015b ; Thiel et al. 2012 ).

Compensatory Strategy Case Application

Below we present a case with an ethical, professional dilemma and discuss how each SMART strategy can be properly applied in this example. We then caution against flawed application of these SMART strategies and highlight potential pitfalls to effective strategy application. It should be noted that, while the main character in the following case is a research assistant, applying the SMART strategies is a skill that can be learned and utilized by individuals across career stages and professions. The dilemma is as follows:

Sara is a new research assistant in the social science lab of Dr. Jackson. She recently emigrated from China. Knowing that Sara is great with quantitative data analysis, Dr. Jackson asked her to run some statistics on data gathered by other research assistants on a National Science Foundation (NSF) grant that Dr. Jackson received two years ago. She runs the statistics, but none of Dr. Jackson's hypotheses are confirmed. She thinks the study was simply under-powered. When she speaks with Dr. Jackson, he tells her she is mistaken and asks her to run the tests again. She does so, with the same results as before. This time, Dr. Jackson is angry, calls her incompetent, and says he will give her one more chance before he hires a new research assistant to run the statistics. Sara is fearful that she will lose her student visa if she loses her funded position. She drops several outliers, changes the data for several subjects, and produces results that Dr. Jackson likes very much.

The above illustration is a great teaching case because, at first glance, Sara appears to be a victim: Dr. Jackson did not help her to do good work; rather, he bullied her to get the results he wanted. At the same time, the case illustrates a failure to use good decision-making strategies in a stressful situation with competing interests, where few good options readily present themselves. Sara made a very bad decision: she committed research misconduct through data falsification, and because the project was federally funded, she and her institution could now be prosecuted for a federal crime. While not every difficult situation requires the use of every one of the SMART strategies, Sara may have benefited from using each of them.
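The statistical heart of the case, low power and "helpful" outlier removal, can be made concrete with a short simulation. This is an illustrative sketch using made-up numbers (a 0.5 SD effect and ten subjects per group are our assumptions, not details from the case). It shows two things: an under-powered design detects a real effect only in a minority of replications, and quietly dropping the lowest-scoring treated subjects mechanically inflates the apparent group difference, which is why Sara's "fix" is falsification rather than analysis.

```python
import random
import statistics

def t_stat(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = statistics.variance(a), statistics.variance(b)
    se = (va / len(a) + vb / len(b)) ** 0.5
    return (statistics.fmean(a) - statistics.fmean(b)) / se

random.seed(42)

# 1) Under-powered design: a real but modest effect (0.5 SD) with only
#    n = 10 per group crosses |t| > 2 in well under half of replications.
def simulate_power(effect=0.5, n=10, reps=2000, crit=2.0):
    hits = 0
    for _ in range(reps):
        control = [random.gauss(0, 1) for _ in range(n)]
        treated = [random.gauss(effect, 1) for _ in range(n)]
        if abs(t_stat(treated, control)) > crit:
            hits += 1
    return hits / reps

power = simulate_power()  # roughly 0.2 for these settings

# 2) "Dropping outliers" to please a supervisor: discarding the two lowest
#    treated scores always inflates the estimated group difference.
control = [random.gauss(0, 1) for _ in range(10)]
treated = [random.gauss(0.5, 1) for _ in range(10)]
honest_diff = statistics.fmean(treated) - statistics.fmean(control)
trimmed = sorted(treated)[2:]  # silently remove the two lowest values
biased_diff = statistics.fmean(trimmed) - statistics.fmean(control)
```

The honest responses to the first result are a power analysis or a larger sample, not post hoc data editing; the second computation is exactly the manipulation that constitutes misconduct.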

Sara could have asked other research assistants, graduate students, or postdocs for help with addressing problems with the analyses and for strategies for approaching and communicating with Dr. Jackson. If issues with Dr. Jackson had been persistent over time, Sara could have sought support from colleagues or other faculty members who could provide advice for navigating the troubling work relationship. Ideally, the environment in the department would allow Sara to feel comfortable approaching another faculty member or others for help. Sara could have referred to objective field standards for conducting the analyses and for determining how to proceed after unsuccessful analyses. After conducting the initial analyses, Sara could have asked a member of her lab to re-run the analyses with her in an attempt to address any potential mistakes. Doing so may have affirmed her initial findings and assuaged concerns that she had approached the analyses incorrectly. Sara could have involved a mediator, such as a university ombudsperson, to help find a viable solution if she was unable to do so after exhausting the aforementioned options. A more complete picture presents itself after seeking help and additional information, and more ethical and professional courses of action become apparent.

Because of the threat the situation poses to Sara’s personal and professional goals, emotions run high in this scenario. Sara wishes to be successful in her career and education, maintain her position in the United States, and earn Dr. Jackson’s approval. Sara is also likely aware that Dr. Jackson wishes to maintain a successful reputation in his field, publish interesting findings, and be productive throughout his career. She should introspectively identify her emotional reactions of anxiety, fear, frustration, and stress. When Sara was chastised by Dr. Jackson, she could have taken a “time out” to calm down and acknowledge how her emotions could override taking a more rational approach to addressing the problem instead of hastily reacting to Dr. Jackson’s response. At a broader level, taking time to manage stress each day would help Sara cope with the pressures and day-to-day stressors of her work. By identifying and managing the range of emotions experienced when faced with ethical and professional situations, clearer and more thoughtful judgment is likely to result.

Considering both the long-term and short-term consequences for all possible individuals is central to making a quality professional and ethical decision. Specifically, Sara should consider how falsifying data could end up negatively impacting not only her career trajectory and her immediate ability to work in the United States, but the careers and reputations of Dr. Jackson, her fellow lab mates, and the university where she works. Data falsification also undermines public trust in the field and scientific enterprise more broadly. In addition to attempting to minimize risk, Sara could have also considered how to maximize the benefits of, or make the best of, the situation. Perhaps by addressing the limitations of the analytical approach and bringing the analysis issue to light, a learning opportunity for everyone in the lab could have presented itself, paving the way for smoother management of similar situations in the future. Forecasting downstream consequences for all individuals that could be impacted by a given course of action is essential to maximizing benefits and minimizing harm to oneself and others.

Taking time to identify formal laws and policies and informal professional and social norms will help elucidate the context in which an ethical or professional dilemma unfolds. Sara could have identified the causes of problems and tensions in the situation, including publication pressures, Sara being new to the job, job stressors, and the like. By doing so, she could have more concretely comprehended the factors that limit her choices and could have avoided tunnel vision or narrow-mindedness in approaching the problem. Sara could have taken time to reflect on relevant ethical principles and regulations as they relate to falsifying data. Doing so may have cued her to not manipulate the data to obtain certain findings.

For better or worse, Dr. Jackson is her supervisor, and she must figure out a way to navigate the interpersonal problem in the case: He is upset and has threatened to fire her. Some of the strategies described above under “Seeking Help” might assist her in navigating the political dimension of this situation. Additionally, if these strategies fail, she should recognize that Dr. Jackson’s lab is situated within a larger institutional context. She could have reached out to other individuals within the university (e.g., department chair, research integrity officer) who prioritize responsible research and mentoring after exhausting alternative courses of action. These individuals, in turn, could have provided support and helped Sara navigate a path forward. Realizing the entirety of the context opens up a wider realm of options in navigating this challenging and threatening situation.

Understanding the motives of oneself and others provides the opportunity to consider multiple perspectives and take steps to avoid biased decision-making. While it can be challenging when one feels affronted, it can be helpful to consider the perspective and motives of other parties in the situation. For example, Sara could have considered whether Dr. Jackson was having a stressful day and overreacted when she initially approached him. She could have better managed self-serving and self-protecting biases perpetuated by her fear of not being allowed to work in Dr. Jackson’s lab by acknowledging how they may be distorting her perception of the situation. Sara might have questioned whether her analysis was correct; perhaps she did make an error and the study was not underpowered. That is, Sara should have questioned her assumption that, if she did conduct the analyses correctly, falsifying data was the only available option that would allow her to keep her position. Seldom is professional decision-making served well by engaging in simplistic either-or thinking. It is likely that multiple alternate courses of action would have presented themselves if she had engaged these strategies. Being proactive in managing biases by engaging in self-reflection and considering the perspectives and motives of others is beneficial to quality professional decision-making.

Questioning one’s assumptions is also a classic emotion management strategy used in cognitive behavioral therapy. Sometimes just realizing that we are making assumptions about how others perceive the situation and about our limited options can relieve anxiety.

Evaluate and Revise

If one wishes to take these strategies a step further to engage in “SMART-ER” professional decision-making, they can: 1) Evaluate their decision and its outcomes and 2) Revise future behavior in similar situations. By acknowledging what did and did not work well in past situations and attempts at strategy use, modified and improved approaches to professional decision-making can be taken when faced with other professional and ethical challenges in the future.

Considerations for Applying SMART Strategies

While the SMART strategies are an excellent tool for professional decision-making, it is equally important to recognize several considerations when utilizing this approach. While a five-part decision-making aid can be highly useful for navigating complex, ambiguous professional situations, it is not a perfect algorithm or a panacea for all ethical and professional conundrums. Given situational limitations and available contextual information, it may not always be possible to use each strategy fully, and challenges in navigating the problem will still exist. Not all strategies will be equally applicable across all situations, and they may not be applied in the same order in every situation. However, SMART strategies are generalizable to myriad contexts, professions, and dilemmas and are not limited to major ethical transgressions such as fabrication, falsification, and plagiarism.

An additional consideration for using SMART strategies is that people may have a preference for or tendency to use one strategy over the others. While the SMART strategies are interrelated, over-attending to one strategy may result in biased or incomplete information gathering and information processing and, ultimately, sub-optimal professional decisions. When individuals face emotional, stressful, or ethically-charged situations, it is important that they consider and use multiple strategies to inform well-rounded decision-making. When educating trainees on SMART strategies, educators should encourage trainees to use a balanced approach and consider multiple strategies.

Perhaps one of the most considerable challenges educators may encounter is motivating trainees to use these compensatory strategies regularly. Simply teaching the strategies does not guarantee their constructive application. In situations where individuals are overconfident or rushed to solve a problem that needs to be resolved quickly, immediately turning to the SMART strategies is unlikely to be an automatic course of action. Furthermore, if individuals engage in cognitive distortions that rationalize noncompliance or undermine quality professional decision-making, they may fail to see the need for, or utility of, the SMART strategies (DuBois et al. 2015a). Educators should make professionals aware of how they might fall prey to these pitfalls.

A final consideration is that mechanisms other than training exist to reinforce the recall and application of professional decision-making strategies. Such mechanisms include creating ethical and supportive organizational and departmental cultures, developing and enacting ethical leadership and management practices, and establishing institutional policies and procedures that reinforce the use of professional decision-making strategies.

Training SMART Strategies

Below, we examine practices that are useful in conducting professional decision-making training programs and creating pedagogical tools that can be implemented by a research ethics or professionalism course instructor. We focus on training practices designed for adult learners that support their professional growth and advancement ( Knowles et al. 2012 ). This is not an exhaustive list of considerations for designing and planning for an ethics or professionalism training program, and a systematic approach should be taken when developing any instructional program ( Antes 2014 ; Antes and DuBois 2014 ). Rather, the pedagogical practices described below were selected because of their implications for the transfer of complex skills, such as professional decision-making, to the workplace after training has occurred. That is, facilitating trainee learning, retention, and application of the content learned during training is essential to improving professional decision-making and making the training successful ( Noe 2013 ).

SMART Training Program Practices

Establish Learning Objectives

Prior to presenting training content, provide trainees with stated objectives of the training program that define the expected outcomes of training and what it is they will be expected to accomplish as a result of completing the training ( Moore et al. 2008 ). Doing so alerts trainees to what is important and helps consolidate learning. Learning objectives have three components: 1) a statement of expected performance standard or outcome, 2) a statement of the quality or level of expected performance, and 3) a statement of the conditions when a trainee is expected to perform the skill learned in training ( Mager 1997 ). An example learning objective for a professional decision-making, or “SMART” strategies-focused, training is: Trainees will be able to apply professional decision-making strategies when they are faced with uncertain, complex, and high-stakes professional and ethical decisions in the workplace.

Create Meaningful Content

Explaining to trainees how a SMART strategies-focused training will directly benefit them and describing how training content is specifically linked to experiences in their profession will help garner buy-in from trainees ( Smith-Jentsch et al. 1996 ). Trainee dedication to achieving learning objectives is essential for learning and retention to occur and for transferring knowledge and skills to the work environment ( Goldstein and Ford 2002 ; Slavin 1990 ). To demonstrate the benefits of training, the content of training needs to be practically useful and applicable. This includes presenting content that is relevant to trainees’ professions and that addresses ethical and professional issues they have faced or are likely to face in their careers. Discussing a case or critical incident that the trainees have encountered, or something similar to what they have encountered, is an effective way to get them engaged with and derive meaning from training content.

Engage Multiple Pedagogical Activities

Pedagogical activities that occur during training reinforce key training concepts, help trainees derive meaning from training content, and facilitate active learning of professional decision-making skills. How trainees learn is equally as important as what trainees learn during training. Integrating case studies, individual reflection activities, think-pair-share exercises, and role plays into training fosters learning more than a traditional lecture format ( Bransford et al. 1999 ; DuBois 2013 ; Handelsman et al. 2004 ). These pedagogical activities vary in terms of their complexity and length, resulting in dynamic training content. Engaging trainees with these activities provides them with the opportunity to examine and connect their existing knowledge, experiences, and perspectives to the learning material. Table 2 provides a brief overview of how to implement various pedagogical activities, along with estimated level of complexity and duration.

Table 2. Pedagogical activities

Activity | Activity implementation | Complexity level | Duration
Case Studies | Present a factual or fictional scenario depicting a complex ethical or professional situation; trainees, often in small groups, analyze it and apply decision-making strategies | Moderate complexity | Moderate to long (5 to 30 min)
Individual Reflection | Have trainees individually reflect on personal experiences or process a case on their own | Low complexity | Short (1 to 5 min)
Think-pair-share | Trainees think through a problem individually, pair with a neighbor to exchange ideas, then report key points to the larger group | Moderate complexity | Moderate (5 to 10 min)
Role Plays | Trainees act out the roles of characters facing a dilemma in a hypothetical scenario, then debrief with observers and receive feedback | High complexity | Moderate to long (5 to 30 min)

Case Studies

Applied to professionalism and research, case-based learning consists of using factual or fictional scenarios to illustrate examples of complex and ambiguous ethical and professional situations researchers may face (Bagdasarov et al. 2013; Johnson et al. 2012; Kolodner 1992). Case-based learning helps trainees link course concepts to realistic, real-world scenarios by having them immerse themselves in these scenarios and explore how to apply professional decision-making strategies (Miller and Tanner 2015). The positive effects of case-based learning are magnified when trainees work together in small groups to collectively seek out important information, ask relevant questions, and find solutions to the problem (Allen and Tanner 2002a). This enables greater breadth and depth of understanding of the decision-making strategies that can be used to address issues related to the case. Trainees can also apply what they learned during this practice to similar situations in the future. That is, trainees can draw upon their case-based knowledge to make sense of future professional and ethical situations and navigate them when they arise (Kolodner et al. 2004).

Individual Reflection

Because of the personal and interpersonal nature of ethical and professional problems, reflecting on personal experiences and processing cases individually reinforces the knowledge base that influences ethical and professional decision-making ( Antes et al. 2012 ). Moreover, when professionals are confronted with ethical dilemmas, they are likely to draw upon personal experiences to make sense of the dilemma and generate solutions ( Mumford et al. 2000 ; Scott et al. 2005 ; Thiel et al. 2012 ). Drawing on past experiences allows professionals to consider important aspects of these past experiences such as causes and outcomes, which are essential for effective professional decision-making ( Stenmark et al. 2010 ).

Think-Pair-Share

Think-pair-share activities consist of having students initially think about a solution to a problem individually, then pairing with a neighboring student to exchange ideas, and finally reporting out key points from their discussion to the larger group (Allen and Tanner 2002b). Discussion between peers enhances understanding of complex subject matter even when both trainees are initially uncertain (Smith et al. 2009). This may be due to the cognitive reasoning and communication skills needed to relay and justify perspectives about complex subject matter to others. Likewise, similar evaluative skills are needed to appraise the viewpoints of the other person and determine whether their explanation and rationale make sense in context.

Role Plays

Role plays are training activities where trainees take on the role of someone in a hypothetical scenario and model what it is like to have the perspective of that character (Thiagarajan 1996). For example, trainees in a role play can model social interactions between characters faced with an ethical or professional dilemma regarding authorship, human subjects protections, mentor-trainee relationships, or data management (DuBois 2013). Role plays enable trainees to learn how to identify, analyze, and resolve these dilemmas because they provide trainees with the opportunity to practice navigating these situations (Chan 2012; DuBois 2013). This technique is particularly effective in trainings that involve exploration and acquisition of complex social skills, such as professional decision-making (Noe 2013). Role play activities have been shown to be effective in ethics instruction (Mumford et al. 2008). They can involve a select few volunteers who perform for the class while the remainder of trainees observe, or involve all trainees divided into small groups of two or three where all trainees take part in the role play activity. Role play activities have been shown to promote a deep understanding of the complexities involved with ethical and professional dilemmas (Brummel et al. 2010).

In order to be effective, however, certain activities must take place before, during, and after the role play (Noe 2013; Thiagarajan 1996). Specifically, before the role play, trainees should be provided with background information that gives context for the role play and a script with adequate detail for trainees to understand their role. During the role play, actors and observers should be able to hear and see one another, and trainees should be provided with a handout detailing the key issues of the role play scenario. After the role play has concluded, both actors and observers should debrief on their experience, how the role play relates to the concepts being taught in training, and key takeaways. Trainees should also be provided with feedback in order to reinforce what was learned during the role play experience (Jackson and Back 2011).

Provide Practice Opportunities

Trainees will need multiple opportunities to practice applying the professional decision-making skills they are learning. Practice opportunities can take the form of the various pedagogical tools, as discussed above, including case studies, individual reflection, and role-play activities. These tools promote active learning and create a safe mechanism for trainees to experiment with SMART strategy application ( Bell and Kozlowski 2008 ). Instructors should also have trainees periodically recall the SMART strategies throughout training. This active recall will increase the likelihood of strategy use beyond practice during training.

Give Feedback

Immediately after each practice activity, instructors should provide feedback to trainees by noting what was done well and where there are opportunities for change or improvement. Feedback should be specific and frequent in order to convey to trainees what resulted in poor professional decision-making performance and good professional decision-making performance ( Gagné and Medsker 1996 ). Carefully guiding feedback-oriented discussions can further enhance learning, retention, and application of SMART strategies.

Professionals across various fields, especially in research contexts, encounter complex situations involving multiple stakeholders that necessitate professional decision-making skills. Fortunately, these skills are trainable, and the SMART strategies decision tool helps facilitate professional decision-making skill retention and application. In the present effort, we approach professional decision-making using a compensatory strategy framework and showcase how each of the SMART strategies could be applied to a scenario involving a professional dilemma. We also discuss how to maximize the effects of a SMART strategy-oriented training program and highlight pedagogical tools to guide SMART strategy education.

This paper provides a guide for educators and institutions seeking to integrate training on professional decision-making skills into their curricula. We provide educators with a robust understanding of the steps involved in mitigating the negative effects of self-serving biases and making sense of complex professional dilemmas. Additionally, we discuss the individual-level and environmental-level constraints that influence the way problems are framed and approached, and the strategies that individuals can use to counteract the negative effects of these constraints on decision-making. Educators can combine this understanding with knowledge of effective training and pedagogical practices to create training content that prepares trainees to effectively navigate the multifaceted professional issues they may face in their careers.

Acknowledgments

We would like to thank John Gibbs, John Chibnall, Raymond Tait, Michael Mumford, Shane Connelly, and Lynn Devenport for their insight and prior work that led to many of the ideas discussed in this manuscript.

Funding/Support This paper was supported by the National Center for Advancing Translational Sciences (UL1 TR002345). The development of the Professionalism and Integrity in Research Program (P.I. Program) was funded by a supplement to the Washington University Clinical and Translational Science award (UL1 TR000448). The U.S. Office of Research Integrity provided funding to conduct outcome assessment of the P.I. Program (ORIIR140007). The effort of ALA was supported by the National Human Genome Research Institute (K01HG008990).

Conflict of Interest The authors declare that they have no conflicts of interest.
