One of the reasons I started this blog was to give pointers to young scientists who are trying to learn how to effectively convey their science to their audience.  I see many challenges that graduate students and post-docs face in this arena.  In fact, I believe that the inability to effectively convey one’s science is the biggest roadblock most of my trainees have faced. While that may not be a statistically valid sample, I suspect it reflects a wider and deeper lack of preparation that many folks have for the real-world need to convince an “audience” that one’s scientific work is worthwhile.

I want to cover just one example of this today.

I am working on a book chapter project with a person in my lab, describing a new algorithm we developed. (I am intentionally vague here so as not to put that person on the spot.)  One of the main components of this book chapter is going through an example of the program running on a data set, then discussing its output.  My co-author chose, as the sole example of program operation, one that had many pathological features.  He did this in order to illustrate where and how the program might fail.

There are several problems with this.  First, this book chapter is likely to be the first time a reader will have heard about our algorithm.  If we tell them all the ways it can fail as the first example of its operation, they’re going to go away thinking that it is junk.  Have you ever gone to buy a car and had the salesman point out all the ways the car could fail or break down?  Never – because you would never buy a car from that person, and that person would very soon be out of a job.  While one would hope that scientists are a bit more circumspect than car salesmen, the underlying psychological principles are the same.

A second issue is that this is the only example provided in the chapter draft, so the only result the reader will see is this one, where the program failed in a number of ways.  However, the chapter is aimed at potential users of the algorithm.  Telling them how it is likely to fail doesn’t convey how they should get the most out of the program on their own data.  Nor does it encourage them to try.  It may be okay to include such a case as a second example, after a first one that works well.  By showing the differences between proper operation and poor operation, we might expect the reader to learn more about the program’s strengths and weaknesses.

As scientists, one of the biggest hurdles all of us face is convincing others that our work is worthwhile.  This problem is particularly acute for those of us who develop software as part of our science.  It is all too common to develop a piece of software, then have it just sit there, mostly unused.  That is a big waste of time and money.  I used to think that if I developed great software, the world would come knock down the door looking for it.  That was naive. There are thousands of pieces of software out there, and many of them work poorly.  A potential user (a biologist, say) could waste a tremendous amount of time trying them and attempting to get them to work, so most people don’t.  The biggest barrier we face as developers is getting people to even consider or try our software in the first place. This is followed by a second big barrier: convincing people to keep using the software, especially if it doesn’t work perfectly the first time out of the gate (what software ever works perfectly?).  To address these hurdles, it is our job as authors to convey emotions such as enthusiasm and excitement to the reader.  Only if we can get the reader sufficiently emotional about the software (in a good way) will they be likely to ever give it a shot, and persist through any problems encountered.

So if the first document that people see about our software shows more about how it can fail than how it can succeed, most people will never bother trying it.  Describing things that way does not engender positive emotion or enthusiasm, so they’ll just move on to the next thing.

In the case of this book chapter draft, the use of the negative example was exacerbated by the way it was described.  My co-author chose to leave the fact that the example had several pathologies as a “surprise,” meaning that it wasn’t until after the output of that example was presented that the pathologies were discussed.  When I first read this part of the draft, I expected that I would see a good working example.  As I got into the text describing it, the deficiencies were then described matter-of-factly, as if they were just par for the course (read: normal behavior).  If the “normal behavior” is deficient operation, the reader’s enthusiasm is substantially quelled.  I probably wouldn’t try software that had been described like this.  I would move on.

This illustrates that how one sets up readers’ expectations is important. If my co-author had indicated from the outset that the example was intentionally chosen to illustrate some pathologies, and had clearly indicated that this was not the usual, expected behavior, I might not have been so surprised (on the downside) when I encountered the actual output.

This is not by any means the first time I’ve encountered this type of problem in a student’s writing.  And it is not associated only with writing about software.  The problem can plague writing about any kind of science that one might do.  I believe it stems from prior training in undergraduate or graduate laboratory classes, where a student is told to do experiments and then write them up to turn in for grading.  In my experience, the number one focus of such assignments is making sure that the student has accurately performed and represented his or her research. If the student overstates or misstates any results, he or she gets harshly penalized.

It is all well and good to definitively teach students not to overstate or misstate their results, because doing so can be career-destroying.  But in focusing on the negative, I believe many instructors overlook the nearly equal importance of teaching students to emphasize the positive in their work. That’s pretty hard to teach when the students are just doing the same old experiment that has been done by thousands of other students and replicates something originally discovered by someone now long dead. There’s no room in such a context to teach the importance and power of conveying excitement and enthusiasm about positive results (while retaining a proper balance with realism).  Because of that, when students come to my lab, almost none of them seem to understand this crucial balancing act, and in fact it is often a long, slow road to teach them.  In my own career, I struggled with this point all through my graduate work, post-doc work, and even through my first years as an assistant professor.

In the case of our book chapter, my co-author could have presented a positive (but realistic) example, then either presented a second, more pathological example, or just discussed in the text what pathologies might occur in certain circumstances.  This approach would have the likely effect of more positively conveying the work, while realistically acknowledging its limitations.

So that concludes my first pointer, and it is a very important one: nobody is going to tout your work for you, you have to do it yourself, and it is a critical part of any type of communication you produce as a scientist.


    2 replies to "Conveying the value of your science"

    • Truden

      Greetings, what good words.
      Thank you

    • James

      Morgan, thanks for the interesting post !

      One thing which definitely helped me in my grad student days was the fact that my advisor would insist that I submit a research report at the end of every semester, written as a sort of mini paper. It proved to be most challenging whenever I had spent that semester troubleshooting. In the end, it was trying to explain what was going wrong and how I tried to design experiments to account for this that really helped me in subsequent manuscript writing.
