Transcript
RICHARD SHARP: Good afternoon, everybody. I think we should go ahead and get started. It's nice to see so many folks here. I know as we get closer and closer to the end of the year, it's always harder to know how many folks will be able to attend. So I'm grateful that you're able to join us this afternoon.
I'm Richard Sharp, and it's my privilege to introduce our speaker today. Jennifer McCormick is an associate consultant in the departments of medicine and health sciences research. She received her doctorate degree in molecular and cellular biology, as well as a master's degree in public policy from the University of Michigan. Before joining Mayo Clinic in 2008, Dr. McCormick was a fellow with the Stanford Center for Biomedical Ethics.
Dr. McCormick lectures frequently on topics related to research ethics and directs Mayo Clinic's graduate school course on responsible conduct of research. She also is one of the primary consultants on the CCaTS Clinical and Translational Research Ethics Consultation Service, or ClinTRECS, and works closely with our clinical investigators and the Institutional Review Board to enhance the quality of research protections. Dr. McCormick's scholarly interests focus on topics in research ethics, understandings of professionalism in science, and the scope of a scientist's social and ethical responsibilities.
She's published extensively on topics related to genetic research, data sharing, and challenges to protecting participant privacy. She works closely with Mayo's Center for Individualized Medicine and, in that capacity, is helping with the planning of its individualized medicine clinic. At a more personal level, Dr. McCormick is a wonderful colleague and friend, a terrific teacher, and just a very highly valued member of our group in biomedical ethics. And it really is a privilege to be able to introduce her to you. I hope that you enjoy her talk on professionalism and science, "What Do Scientists Think?"
JENNIFER MCCORMICK: Thanks, Rich, for that introduction. So I want to kind of dispense with a couple of slides. Disclosures-- there are no disclosures to make. And then to the learning objectives for today, I hope that you can walk away from this talk being able to describe how scientists perceive professionalism, scientists' perceived challenges to being professional in science, and some possible implications of these challenges. Before I really get going, though, I want to acknowledge two of my close collaborators and colleagues on this project.
So we have Kate Nowakowski, who is a biomedical ethics research analyst in our group. And then Dr. Ashok Kumbamu, who is a principal analyst in the Center for Science and Health Care Delivery's qualitative research core. Really, this has been a joint project amongst all of us. I also want to acknowledge the source of funding, the Whitney and Betty MacMillan gift to the Mayo Clinic Program in Professionalism and Ethics, and then also a few other research analysts and summer undergrads who have worked on this study over the years, and, of course, all our study participants.
So how do researchers talk about professionalism? That's kind of the question. And what I think you'll find is-- or what I hope, maybe, you'll find is if you had the opportunity to listen to Dr. Brian Martinson's talk back in September from CCaTS grand rounds, that some of the data that I'm presenting today from this qualitative study actually complement a couple of the points that Dr. Martinson was trying to make in his talk.
So as you'll recall, Dr. Martinson introduced us to Benjamin Sovacool's three approaches to understanding undesirable behavior in research and science. One of those, according to Sovacool, is to really focus on individuals' lack of integrity, or as Sovacool used the term, the impurity of the individual. This really reflects the bad apple narrative posited by Edward Hackett in 1994, where, again, the focus is on the individual: you just simply have bad people.
We hear about this all the time. I mean, I think we think about it a lot in the context of research misconduct and fraud. So to remind you, research misconduct is really narrowly defined as fabrication, falsification, and plagiarism. And this is just a recent article from two years ago.
An individual did a review of the different cases of scientific fraud and misconduct, and it's been cited a number of times. But basically, the bottom line is he says, "The percentage of scientific articles retracted because of fraud has increased approximately tenfold since 1975." And that is alarming. And really, if you look in Science magazine on a weekly basis or Nature, you often see little news blips about someone being accused of research misconduct.
So Sovacool also suggested two other approaches, and I do agree with Dr. Martinson in thinking that these really are complementary approaches, rather than at odds with each other. But the two that I think some of the data I'm presenting today really get at are institutional factors that may be threatening the integrity of research, as well as systemic or structural aspects of the research enterprise. We're starting to hear more about this in the context of irreproducible research.
In 2012, Glenn Begley and Lee Ellis published a commentary in Nature, and they talked about building a stronger system. And they asked, what reasons underlie the publication of erroneous, selective, or irreproducible data? The academic system and peer review process tolerates and, perhaps, even inadvertently encourages such conduct. To obtain funding, a job, promotion, or tenure, researchers need a strong publication record, often including a first-authored paper in a high-impact journal. And they go on.
And I think we're all kind of familiar with this. Later, just this past summer, Glenn Begley really articulated a couple of places where the system could maybe focus in terms of improving this issue of irreproducible research, and I think, even more broadly, the issue of better integration of professionalism and responsibility into the research enterprise. The Economist also had an article about a year, a year and a half ago, highlighting, again in commentary format, some of these issues.
So given this context and this background, my colleagues and I thought, why don't we ask researchers what they think about this notion of professionalism? In our minds-- Dr. Kumbamu's, Kate's, and mine-- professionalism encompassed these broad notions of ethical conduct, responsible conduct, just really being careful and really trying to make the research enterprise as professional as possible.
So our research questions, specifically, were, how do researchers conceive of professionalism in the context of their work, and what are their perceived challenges to professionalism in research? We used a qualitative approach, specifically focus groups. We conducted nine focus groups with five to eight participants each. We did five groups with trainees, trainees being research fellows or graduate students, and four with faculty members. All were from the basic or translational biomedical sciences at Mayo Clinic. All participants were orally consented, and the study was reviewed by the Institutional Review Board.
The focus groups were audio recorded and transcribed, and it was these transcripts that we analyzed using standard qualitative methods. So most of the data I'm presenting to you today are qualitative data, or quotes from our participants, on which we did a systematic analysis. As for sample characteristics, we had really good representation from a broad range of the departments and tracks within the Mayo graduate school. We had 19 faculty, nine research fellows, and 19 graduate students participate across all nine focus groups.
So what do scientists think of professionalism? That was really how we opened up our focus group discussions, asking them, how do you conceptualize professionalism? Overwhelmingly, we heard the word "integrity"-- that integrity to the research is key to professionalism in science.
We also heard individuals talk about objectivity and factual accuracy. This faculty member said, "I think objectivity and factual accuracy is part of it in relation to respect and how you conduct yourself in your research and personal life." Another faculty member talked about communicating things honestly: basically, do the right thing to be sure we are getting our ideas across without deceiving people by inadvertently presenting things that are incorrect. So clearly, these ideas of being true to the data, true to the science, were very important to our participants.
We also heard people talk about this idea of the interactions with people-- their labmates, their colleagues, et cetera. So this faculty member said, "Act professionally in the way that you handle yourself, the way that you interact with people, having a respectful attitude, listening to other people's opinions." So this idea of being respectful, whether you agree or not, was really key and recurring across all the focus groups.
Another research fellow pointed out that professionalism-- what it should be is a code of conduct, treating people with respect, and being that penultimate researcher, where people want to collaborate with you. They want to join your lab. They want to work with you, because you are respectful.
Graduate students had the same thing to say. I think this graduate student really had a broad vision of who should be respected and thinking about not just other scientists. This individual said, "I think it is important for scientists to remember that you are working with other people. You are working for other people. You are working for taxpayers. You're working for your PI. You are working for your institution.
It is just fundamentally treating other people like they are humans." So this even kind of extended the discussion to this idea that it's not just internally within the scientific community, but there's this idea that professionalism in science extends beyond just your colleagues within the scientific community.
So in that context, there was also a lot of talk about responsibility and professionalism. This graduate student pointed out that, yes, science is a profession, and there is a lot of responsibility behind it. To be professional, you need to fulfill those responsibilities.
This research fellow really explicitly stated that social responsibility comes with professionalism. So that, then, opened the door to the conversation about, what is the responsibility of science to society? How are scientists thinking about this? And that was quite a rich discussion, and I captured just a few of the highlights here. But this research fellow talked about it in the context of ensuring the data that are out there are good-- again, this respect and responsibility to the data and the science.
I'll read the quote. "I think in terms of the overall arching picture, it's making sure that there is good data out there, making sure that scientists are checking up on other scientists to make sure that the data is valid. That is what I see as social responsibility."
Others took this notion of social responsibility beyond just the scientific community. Here, this faculty member says, "We have a responsibility to the public in terms of projecting the complete picture. Someone makes a discovery. What does that really show, and what doesn't it show?" So this is really getting at this issue of engaging and interacting with the public-- again, thinking about responsibilities and professionalism beyond just the scientific community.
This graduate student reflected on an experience that he had when he was in college. "When most people think of scientists, they are thinking of a lab coat and nerdy guy. In college, I spent a lot of time talking to high school students and younger college students about getting them interested in doing research. And when they met myself and other colleagues, not that I'm Mr. Cool, but I'm certainly not the prototype scientist that they expected. And I think that engaging the public is part of professionalism." So again, really interacting with the public and talking about the excitement of what we do as scientists.
One faculty member-- and this was not the only individual who talked about this-- raised the idea that it's fundamentally taxpayers, the public, who are paying for our research and that we have a responsibility to them. And part of that responsibility is telling them what we do and why we do it, why research is important, and that we actually appreciate the funding they provide. This faculty member said, "Taxpayers paid for what we did. They deserve to understand what interesting careers we have and why it's such a joy to do this job and justify it."
So that all said, it's clear that our participants really thought that professionalism and all the things that kind of fall under it are critical, but they also were clear to point out there are challenges, at least challenges that they perceive. And so I'm going to kind of go through some of these. And I think some of these, many of you, if not all of you, will probably nod your head and say, yeah, that I can relate to that.
So it was interesting. This idea of a tension between needing to be objective and true to the science and our own personal interests within the scientific community and the culture of science was an underlying theme. It was articulated well by this faculty member, who said, "We try to set aside our personal interests and serve this community. It is a peculiar thing because it is asking people to be dispassionate and set aside self-interest, which is hard for humans to do."
So when we're thinking about this, that self-interest as researchers is advancing in our careers and meeting all the requirements necessary to do that, while at the same time still serving the scientific community and the broader community-- doing all these things they were talking about: engaging with the public, being socially responsible, et cetera.
So this idea of survival was really something that we heard a lot about. This faculty member noted, "I think the question is, does the scientific community right now promote professionalism?" His answer was, "I would say as a generality, no." The individual went on to say, "The scientific community right now, as I see it, being a young investigator, it promotes survival of the fittest for the most ruthless or the fastest or the biggest lab." Now, again, this is a perception that these people had, but it was common across all nine focus groups, whether it was faculty, fellows, or graduate students.
This faculty member pointed out, "We're always struggling to pay the bills, and if you're struggling to pay the bills, it's much more difficult to be very selfless. And that is a particular problem for junior investigators: they don't have a lot of resources, and yet there are milestones that are put on them, or projected at them, that they have to meet. The fundamental choice often becomes, how do I get from point A to point B and not kill myself?"-- meaning, then, that all these other aspects of responsibility, professionalism, responsible conduct, and engaging with the public can easily slip through the cracks or take a lesser priority in the scheme of daily activities.
It was interesting, because we had talked a lot in these focus groups about this idea of engaging with the public and being socially responsible. But some of our participants were actually quick to point out, well, what's in it for me? How is that going to help me advance in my career?
This fellow said, "I think there would have to be a value placed on the professionalism in order to make it really worthwhile, and I don't want to sound shallow, but, indeed, if I'm going to invest time in this, I better get something more than just a pat on the back."
One faculty member actually made a suggestion. Whether it's a reality or a possibility, I don't know, but this faculty member asked the question, "Is the real solution just to have the institution say, hey, you know what? We'll cover 5% of your time, and you have to spend 5% of your time doing outreach. At least that way, the institution sends the message that we view this as something important, instead of the reality being, if you want to get ahead in the institution, the best way to do it is to get one more R01."
So again, going back to that tension that I highlighted at the beginning of the section on challenges, there was a lot of that. And under that came themes of survival, survival of the fittest, and perceived value. There was also the paradigm that we're most familiar with, and that is publish or perish. This research fellow pointed out, "We are so pressured on the publish or perish aspect that if it is not honored or respected within that vein, it's going to be very difficult to get a junior scientist to be professional in that mindset."
So again, thinking about this idea of perceived value, and what's in it for me? And this also is kind of alluding to-- I think if we think back to some of the introductory remarks that I made about irreproducible data-- this is probably some of the things that may be leading to some of that. So again, more on the publish or perish issue-- this idea that you have to publish before someone else, and cut a corner here, or cut a corner there to make it happen.
So I'd like to suggest that, perhaps, there may be some implications of these challenges. As one faculty member said, "We are not fostering a good environment"-- at least as they perceived it. Irreproducible research, I think, is possibly one implication of these challenges that scientists perceive they face. As noted here, "I would rather have a lifetime of things that other groups can replicate, instead of something that is retracted or other groups can't replicate. It is a very slippery slope, and once you're in that boat of individuals that have not been able to have something out of your lab replicated, then that can be a very significant problem."
So there was recognition that this is a problem, and it does not do justice to the science. It's not being respectful of the science or our funders or the taxpayers. But there is this tension, and going back to the earlier quote about this desire, this need, this pressure to feel that you need to get data out there on the publish or perish paradigm. I think, perhaps, that some of these challenges may be leading to this issue of quantity over quality.
This research fellow said, "You have to consider also the system in which we are working. It pushes some people to have to make the choice of, do I wait and replicate and replicate? Or do I push it out so that I have five new publications in 2011 so that when I put in my next R01, they're going to say, oh, I guess he hasn't been sitting around on his thumbs. That's good."
So again, this hints at some of the underlying reasons why we're seeing this inability to reproduce research results, seeing what we are terming sloppy science in the literature: the perception of our participants was that there's this push for quantity over quality. It's the numbers that count, rather than the quality of the research.
Clearly, a lot of them talked about working under pressure, and it was interesting. This research fellow said, "Unfortunately, when the pressure is too high, unless you're just an extremely ethical, moral person and completely on top of everything you're doing, there are more and more things that can potentially fall through the cracks." And I think this gets back at some of the earlier data I've presented, where you have to make choices: given the pressures of the system, do you go this way, or do you go that way?
Something that was really highlighted, and kind of surprising to my colleagues and me, was this issue of mistrust. We had framed it as a challenge to being professional, but I think it is really a challenge of being in the scientific community, of being a part of the research enterprise. Here, a graduate student is talking about abstracts at an American Heart Association meeting.
And she says, "It is very clear, all abstracts cannot be published prior to the meeting. You go there, and invariably, you find abstracts all over the place, almost a third nowadays that are already published, because people are so paranoid that they're going to show up with their abstracts, and somebody else is going to see it and run back to their lab, or call back to their lab, and have them start some experiments."
So again, this idea of mistrust, which I think leads to the broader issue of keeping things close to your chest and not really openly sharing and communicating with each other, was highlighted not just by graduate students, but also by faculty members. One had a specific example: someone on a review committee for a very important journal, say Nature Medicine or Nature Genetics, who deliberately held up a paper they were reviewing so that they could get their own study done and published first. It's happened, and this is a specific example. And indeed, there were a lot of people in that focus group nodding their heads and saying, yes, I suspect that this has happened to me.
At the end of this discussion, the point was made that this is one place where professionalism has failed: the trust within the scientific community, and I would suggest maybe even beyond the scientific community, has been compromised. And I think that's potentially not a good thing for the conduct of research.
So this faculty member, I think, asked a really good question. No one had a very good answer to it, but I think I would like to pose it here and see what you folks think. "What's concerning is if the survival of the individuals breeds the destruction of the enterprise, then it is not worth surviving, right?"
So indirectly, I think what he's suggesting is that if we've created a system in which individuals are focused only on themselves, and it's potentially destructive to the enterprise-- which many people in our focus groups were saying-- then maybe we need to step back and ask, is it worth surviving, or, alternatively, should we perhaps reevaluate the system in which we do research?
So what can we do? We have a couple slides here of what our participants noted in the context of what we can do. One, there was a lot of talk about mentoring-- that mentoring was really key and that it was actually a responsibility, not just of faculty members, but of graduate students as well: senior graduate students mentoring junior graduate students, et cetera.
So this faculty member said, "Where we can directly influence outcomes, I'm thinking in particular, is our training of our students and our post-docs. And that training that we give them-- is it for their benefit, or is it for our benefit? What's the fine line here? Are we giving them the best training that we can? I think that we have a social responsibility that we have a much greater control of, and the effort that is required on your part could be even much larger than the effort to influence some global or cosmic outcome."
So what this individual was asking his colleagues in the focus group was, when you're mentoring your graduate students or your fellows, it's important to step back and ask, am I doing this for my benefit or their benefit? And his answer was that it should be for their benefit, which indirectly is for the benefit of the larger scientific community.
Courses-- so there was overwhelming agreement that more training for scientists to deal with the ethical and societal implications of their research would be useful. So that's one thing we can do. It's pretty tangible, but I think there are probably problems with that as well. What kinds of courses? How does that compromise the training that graduate students have? Is it within the scope of graduate students' and fellows' training to think more broadly about societal implications and ethics?
So in summary, just some of the highlights from the data I've shared. Overwhelmingly, we heard this idea that integrity is really critical to the success of the research enterprise. As this faculty member said, and I think it really hits home, at least for me, "Integrity takes a lifetime to build and a second to lose." And I think that's really important for us to remember, and it's not just integrity within the scientific community, but integrity with the public and maintaining the trust of the public.
So the themes that we heard-- professionalism. Overwhelmingly, we heard people say that yes, research science is a profession, and yes, scientists should be professional. Professionalism includes being objective and honest, respectful, respectful to your colleagues, respectful to the public, respectful to research participants. There's a responsibility, a responsibility within the scientific community, but also a social responsibility. And this includes engaging with the public in whatever mechanism is appropriate.
There's also this idea of ethical conduct and the importance of just thinking about the basic ethics of the work you do. No doubt, there were challenges, and actually, when I think back to the focus groups, much of the discussion was around these challenges. Survival of the fittest was really highlighted a lot. Perceived value-- this idea of publish or perish leading to all of this, leading to some of the implications of pressure, quantity over quality, mistrust, and the idea that there is this tension between wanting to be objective and true to the science and having your own personal interests that you need to pursue.
So some of the suggestions we heard were mentoring and course work, and personally, I think those are probably good steps, but they may not be the only things that can be done. And as a kind of side note, Dr. Kumbamu and I, in collaboration with CCaTS Education Resource, actually tried to do something. In the summer of 2013, and then again this past summer of 2014, we offered a course to the CTSC track in the graduate school called Science Beyond the Laboratory-- the Intersection of Science, Society, and Policy.
And I put this up here just to show you that-- and I will admit, I'm very pleased, but pleasantly surprised as well-- we had people who felt that they had learned a lot or an incredible amount. Individuals in our class made statements like, "Very glad I took the course," "Grew up intellectually," "Very informative," "Talked about things that I would have never talked about before," "Broadened my experience."
Another individual said that, "Extensive discussions encourage students to engage and think about the big issues and reflect on their own experiences. I learned a lot from this course. It was an excellent experience. I think it has a lot of potential, and I would highly recommend it to other students."
So one way of looking at my putting this up here is as touting my own course, and yes, that might be an indirect effect. But more importantly, what I want to say is that we heard people say, wow, we did learn a lot, and we really appreciated the opportunity to engage in thoughtful, deliberative discussions about these issues. Now, recognizing the challenges that I highlighted in some of the data-- we're put into this survival-of-the-fittest, publish-or-perish system-- how do you allocate time for this? I think that is a big challenge, and how institutions can enable it is something to consider.
So in conclusion, the discussions around ethics, professionalism, and social responsibility ought to go beyond the implementation of protocols in laboratories and institutions. Professionalism involves not only knowledge and expertise, but also the virtues of trustworthiness and altruism. The survival-of-the-fittest model in the contemporary culture of science has negative implications for the research enterprise, one of these being a potential loss of trust within and outside the scientific community.
So some final thoughts. As you may recall from Dr. Martinson's talk, he asked the question, is it sufficient to consider the motives of these individuals-- these individuals being those who are accused of misconduct-- while ignoring the broader context? He asserted no. I would like to suggest that the data we've presented provide some support for that notion, and my colleagues and I would also say no to this question.
So the final question that I think we need to ask ourselves-- and I will be quite frank, I don't know the answer-- is, are there measures institutions can take to better promote professionalism and social responsibility? If so, what are they, and how can we effectively implement them? Thanks for listening, and I welcome, and will try to answer, any questions you may have.
Jen McCormick, Ph.D., presents Professionalism in Science: What Do Scientists Think?