
Virtual Town Hall with Os Steward and Lique Coolen, Thu 9/19 @ 1pm EDT


Andrew Chen

Join Os Steward and Lique Coolen, leaders of SfN’s Foundations of Rigorous Neuroscience Research (FRN) program, for a live virtual town hall on September 19, 2019, from 1-2 pm EDT. Drs. Steward and Coolen will facilitate an online discussion where you can share your experiences and ideas for enhancing a research culture that supports scientific rigor. They can also answer your questions about FRN and collect your suggestions about the topics and speakers the training program should include.

Reply to this topic below with your questions, either before the town hall or during the event, for Os and Lique to answer!

Meet the FRN program’s co-PIs:

Oswald Steward, PhD


Oswald Steward is founding director of the Reeve-Irvine Research Center for Spinal Cord Injury at the University of California, Irvine. He is Reeve-Irvine Professor of Anatomy and Neurobiology and holds additional joint appointments in the departments of neurobiology and behavior, and neurosurgery. His research focuses on how neurons create and maintain their connections, how synapses are modified by experience and injury, and the role of genes in neuronal regeneration, growth, and function. He received his PhD in psychobiology/neuroscience from the University of California, Irvine.

Lique Coolen, PhD


Lique Coolen is a professor of biological sciences and associate dean of the College of Arts and Sciences at Kent State University. Previously, she was a professor of physiology and neurobiology and associate dean for postdoctoral studies at the University of Mississippi Medical Center. Her research focuses on the neural basis of social behavior and endocrine function, as well as spinal reflexes, with the goal of developing new treatments for spinal cord injury. Coolen earned her PhD in neuroscience from the University of Nijmegen in the Netherlands.

We are evaluating this program at every step and rely on your input and feedback to improve! At the end of the Live Chat, please fill out the survey below.

Gabriella Panuccio

Hi there,

First of all, I want to congratulate you on this fundamental initiative. At a time when "publish or perish" has become something of a mantra and research funding is harder and harder to acquire, I sadly feel that overlooking scientific rigor is, in a way, a natural consequence of our time. Many scientists are led to believe that publishing the big-statement paper is the key to acquiring funding, securing a position, and standing out in the scientific community. There is already a big debate about impact factors, the h-index, and other indicators, and it is hard to change this evaluation mindset; I find it deeply rooted in the scientific community.

I believe that the major moral responsibility for creating a culture of scientific rigor lies in the hands of supervisors and their role as mentors. I believe that the most effective measure to enforce scientific rigor is to educate our students to pursue it. As scientists, our first moral responsibility is searching for the truth. Unfortunately, I have known of supervisors being so pushy and demanding that students felt forced to tweak their data; I have also known of supervisors who taught their students to disregard inconvenient data or even to add fabricated preliminary findings to their grant applications to make them more convincing. Such situations are rarely brought to attention or, if disclosed, they tend to be overlooked by institutions for the sake of their academic prestige. I believe that human ego plays a significant role in this.

I find that collaborative research and open data sharing facilitate honesty and rigor, but these measures can't always be put into practice, whether because of individual grants or because of confidential data that can't be openly shared.

Maybe research institutions and universities should enforce an internal peer-review system for data quality, evaluating their labs periodically. Any thoughts?

Oswald Steward

Hello, 

We (Lique Coolen and Os Steward) are pleased to co-host today's virtual town hall to launch the activities of the new SfN program “Foundations of Rigorous Neuroscience Research”. The purpose of today's chat is to begin engaging with you to identify general areas where we, as members of our professional society, can work together to create positive change that will enhance scientific rigor going forward. We invite you to share structural challenges related to scientific rigor that you have encountered, solutions you have identified to address these challenges, and the resources you have found useful, and also to identify where resources are lacking.

To keep the conversation constructive, we want today’s discussion to focus on general challenges. Any individual issues should be discussed in other settings that provide adequate time to fully analyze the situation in question. Please review SfN’s Community Guidelines. If what you would like to share includes identifiable information about a person, journal, or specific article, please share your experience in this anonymous form.

With that, we invite you to share your questions, comments, and suggestions.

 

Oswald Steward
In reply to Gabriella Panuccio:

Hi Gabriella,

Thank you for your comments. The idea of internal peer review is interesting. Do you know of any institutions where this occurs?

Lique Coolen
In reply to Oswald Steward:

Hi Gabriella, Thank you for your thoughts and input. You bring up some really important issues. What do you think contributes to or motivates this behavior in supervisors or mentors?

Lique Coolen

We encounter more negative results than positive ones in research, and yet publishing them is not common. Have you tried to publish negative results? Did you encounter any pushback from journals? What were your solutions?

Guest 1c99d...b41

I've had this experience trying to publish negative data as part of a larger study. It got negative feedback from reviewers for not supporting the overall "story," and our solution was to leave it out of the resubmission, which wasn't satisfying.

Anonymous poster hash: 1c99d...b41

Guest P. Thorb
In reply to Lique Coolen:

Sometimes the pushback comes from the PIs. As a grad student, I have had my PI refuse to include some graphs, even as supplementary data, because they "didn't add anything to the story". How can we counter that?

Guest Eleanor
In reply to Lique Coolen:

It's such an individual reviewer response; how much control do journals have over that?

T Celeste Napier

Research is often supported by tax dollars. Confidence in the accuracy of data collection is therefore an important part of conveying scientific rigor to lay people and policy makers, and thus of the ultimate impact that science can have on society. I would be interested in hearing views on how we can enhance communication of scientific rigor (as well as impartiality in data interpretation) to the non-scientist.

celeste in chicago

Lique Coolen
In reply to Guest P. Thorb:

Thank you for your input. Do you think the pressure to publish is a factor? And if so, what solutions can we think of to lessen that pressure?

Oswald Steward
In reply to Guest 1c99d...b41:

All of you have raised important issues related to reviewers and PIs. Do you think bioRxiv could be a place for negative data? Or does there need to be some other open repository?

Oswald Steward
In reply to T Celeste Napier:

This is a really important issue, which also relates to how to discuss failures to replicate with non-scientists. For many scientists, there may be limited options for communicating with lay audiences. Is this something that we, as a community, could improve on, and if so, how?

Guest anon - MK

How does diversity in the academic community affect or intersect with scientific rigor? Thanks!

Lique Coolen

I am very interested to hear from all of you: what actions have you taken, at your career stage, to implement a culture of scientific rigor in the labs you have worked in?

Oswald Steward
In reply to Guest anon - MK:

I personally don't know how to answer this.  Independently, these are extremely important issues related to the culture of science.  But how they might intersect isn't clear to me.  Do you have ideas of how we might begin to gather information to answer this question?

T Celeste Napier

Perhaps "failures to replicate" could be re-couched as something like "expanding the knowledge base with different outcomes." Then we can stress the value of repeating studies conducted by others and assure non-scientists that this is inherent to rigor in the scientific process. A different outcome (aka a failure to replicate) is NOT necessarily a bad thing.

Guest 1c99d...b41
In reply to Lique Coolen:

As a graduate student, I controlled as many aspects of my experimental design as I considered necessary to be rigorous and confident in my conclusions. This included incorporating positive and negative controls in every experiment, blinding, diligent record keeping of experimental details, and online notebooks that others could see.

Anonymous poster hash: 1c99d...b41
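
One of the practices mentioned above, blinding, can be built directly into an analysis workflow. Below is a minimal sketch in Python of one way to do this (the sample IDs, group labels, and file names are hypothetical illustrations, not details from the poster's lab): the real group labels are replaced with neutral codes, the analyst works only from the coded file, and the key is set aside until the analysis is locked.

import csv
import random

def blind_groups(samples, seed=None):
    """Replace real group labels with neutral codes and return the
    blinded records plus a key for unblinding later."""
    rng = random.Random(seed)
    labels = sorted({group for _, group in samples})
    codes = ["group_" + chr(65 + i) for i in range(len(labels))]
    rng.shuffle(codes)
    key = dict(zip(labels, codes))
    blinded = [(sample_id, key[group]) for sample_id, group in samples]
    return blinded, key

# Hypothetical records: (sample_id, true treatment group)
samples = [("s01", "treated"), ("s02", "control"),
           ("s03", "treated"), ("s04", "control")]

blinded, key = blind_groups(samples, seed=42)

# The analyst works only from blinded_samples.csv; the key file is
# held by a colleague (or sealed) until the analysis is finalized.
with open("blinded_samples.csv", "w", newline="") as f:
    csv.writer(f).writerows(blinded)
with open("unblinding_key.csv", "w", newline="") as f:
    csv.writer(f).writerows(key.items())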

Lique Coolen
In reply to T Celeste Napier:

This is a really great and important point. Some journals are indeed discussing or changing the language surrounding failure to replicate. Here is a link to an eNeuro editorial to help with this discussion: https://www.eneuro.org/content/5/1/ENEURO.0042-18.2018

T Celeste Napier

The eNeuro link was very useful - thank you!

Guest anon - MK
In reply to Oswald Steward:

Are there well-defined parameters of scientific rigor that could be used to assess whether something was or was not sufficiently rigorous? And could we then determine whether there are patterns in which groups have a stronger culture of scientific rigor? Spitballing here - I think there are many ways you could do it, but you'd probably have to start by defining the terms.

Guest Mary
In reply to Guest 1c99d...b41:

I would also add that having clear guidelines about outliers (how they are defined and what to do with them) is very important. Not all "inconvenient data points" are outliers!

Lique Coolen
In reply to Guest Mary:

Great comments, which make me think of the use of statistics. We, as a scientific community, base our conclusions mainly on p-values. Some alternative and more accurate ways to interpret data have been suggested; for example, eNeuro is encouraging the use of estimation statistics in publications. Do you think we need more statistical tools to better report, interpret, and discuss our data?
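
For readers who have not used estimation statistics, here is a minimal sketch in Python (plain NumPy, with made-up numbers) of reporting an effect as a mean difference with a bootstrap 95% confidence interval instead of a bare p-value. It illustrates the general idea only and is not the specific procedure recommended in the eNeuro editorials.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measurements from two groups
control = np.array([4.1, 3.8, 5.0, 4.4, 4.7, 3.9, 4.3, 4.6])
treated = np.array([5.2, 4.9, 5.8, 5.1, 5.5, 4.8, 5.4, 5.0])

observed_diff = treated.mean() - control.mean()

# Resample each group with replacement to estimate the uncertainty
# of the mean difference (a simple percentile bootstrap).
boot_diffs = np.empty(10_000)
for i in range(boot_diffs.size):
    c = rng.choice(control, size=control.size, replace=True)
    t = rng.choice(treated, size=treated.size, replace=True)
    boot_diffs[i] = t.mean() - c.mean()

ci_low, ci_high = np.percentile(boot_diffs, [2.5, 97.5])
print(f"Mean difference: {observed_diff:.2f} "
      f"(95% bootstrap CI: {ci_low:.2f} to {ci_high:.2f})")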

Andrew Chen
15 minutes ago, Guest Sabah Ul-Hasan said:

To my knowledge, SfN is the only research conference to seriously consider this topic and implement some solutions around it. I would like to ask what additional steps will be taken at the upcoming conference and those that follow. This question also pertains to citizenship and the travel restrictions some may face as a result. SfN is already leading the way (there is still much to be done, but at least it's something -- so, thank you!). To that end, in our current climate change crisis, what will be our commitment as scientists to changing the paradigm of conferences (specifically travel and emissions) in the interest of environmental justice and of practicing what we preach about the evidence we believe to be true (that humans are a major catalyst of climate change)?

Hello Sabah! Thank you for your comments and questions; you raise a very important point: science is a global endeavor but, due to political factors, can be limited in its efficacy.

To address your concerns about unnecessary travel and its harmful environmental effects, I believe that SfN is attempting to bridge these gaps by organizing Virtual Conferences, which garner viewership from all over the world! There is definitely more to be done, but I think we're taking the right steps forward.

Because this town hall is mainly related to the Foundations of Rigorous Neuroscience Research program, I'll direct your questions to the Science Knows No Borders forum, where SfN conference presenters who were denied a travel visa to the US can still share their research with other attendees. I'm sure they will have many thoughts about the issues you bring up and would love to hear your thoughts as well.

Thank you!

 

Oswald Steward
In reply to Guest Mary:

Excellent point! In my personal opinion, it's important to report any exclusions and how they affected the outcome.
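
Taken together with the earlier comment about defining outliers, one concrete habit is to fix the exclusion rule before looking at the data and then report the result both with and without the excluded points. Below is a minimal sketch in Python; the 1.5 x IQR rule and the numbers are hypothetical stand-ins for whatever criterion a lab pre-specifies.

import numpy as np

def iqr_outlier_mask(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR]; k is fixed in
    advance so the rule cannot be tuned after seeing the results."""
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    return (values < q1 - k * iqr) | (values > q3 + k * iqr)

# Hypothetical measurements, one of them suspiciously large
data = np.array([2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 6.8, 2.2])

outliers = iqr_outlier_mask(data)

# Report both analyses so readers can judge the impact of exclusions.
print(f"Excluded {outliers.sum()} of {data.size} points: {data[outliers]}")
print(f"Mean with all points:  {data.mean():.2f}")
print(f"Mean after exclusions: {data[~outliers].mean():.2f}")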

