Saturday, February 1, 2020

More about the "discovery" of stem cells

This post isn't specifically about cancer stem cells. It's a response to a recent tweet and an earlier blog post about stem cells by Paul Knoepfler (@pknoepfler).

Knoepfler, on January 30, 2020, posted an ICYMI (In Case You Missed It) on Twitter. The ICYMI pointed to a blog post of his, dated April 11, 2012, entitled "Who really discovered stem cells? The history you need to know".  In this post he questioned claims made by others that Ernest McCulloch and I discovered stem cells.

I didn't respond to the original 2012 post, and the comments section on it has now closed. Replies to the ICYMI tweet itself are constrained by Twitter's character limit. Hence this blog-based response.

There were three main reasons why I didn't respond to the original post by Knoepfler in 2012. Firstly, I had no disagreement with most of the historical content of his post (except for some of his comments about our own work). Secondly, Ernest McCulloch was no longer alive (obituary in The Lancet) to contribute to our customary joint response. Thirdly, one major point that I would have made in any response had already been well expressed by Lisa Willemse (@WillemseLA).

Lisa pointed out, in this excerpt from her comment on the original post by Knoepfler, "The term “discovery” is not one they would have chosen to describe their work, yet it has been given to them for the very reasons you mention here – a desire to have scientific heros, to claim “firsts”, and our (and the media’s) preference for absolutes".

Ernest McCulloch was a hematologist. He knew that the concept of stem cells of the blood-forming system had existed in the literature for many years. The concept was so well known that it was included in textbooks for medical students, such as Ham's Histology. A review, in 1954, of the 2nd edition of Arthur Ham's book is available here [click on the PDF icon].

Ernest McCulloch was also aware that many attempts had been made to identify individual stem cells using histological techniques. An example was provided by the work of Clermont and Leblond, such as a paper of theirs published in 1953, "Renewal of spermatogonia in the rat". Ernest believed that such techniques, when applied to the search for stem cells of the blood-forming system, could not yield unequivocal results.

Knoepfler, in his original blog post, noted that he found "it very curiously puzzling that Till and McCulloch did not employ the name “stem cells” ..." in a paper published in Nature [1963(2 Feb); 197(4866): 452-454]. A copy of this paper is openly accessible [click on the PDF icon] via J. Immunol. [2014 Jun 1;192(11):4945-7]. The first author of this paper was the excellent experimentalist Andy Becker (tribute), a medical graduate who had undertaken a Ph.D. program within our research group.

The reason we didn't use the term "stem cells" in this paper was that we were not yet convinced that the cells we were studying were actually stem cells. It was only after further work, published in a less prominent journal, that we became convinced that we were, indeed, dealing with cells that had the properties expected of stem cells. This paper, entitled "The distribution of colony-forming cells among spleen colonies", appeared in the Journal of Cellular and Comparative Physiology [1963(Dec); 62(3): 327-336]. It's openly accessible via the TSpace repository of the University of Toronto Libraries [PDF]. The need for stem cells to be capable of self-renewal, as a crucial component of any definition of stem cells, was emphasized in this paper. The first author of this paper, and the initiator of the work described in it, was Lou Siminovitch, a valued mentor during the early stages of my research career.

Why has the later (Dec. 1963) paper in the Journal of Cellular and Comparative Physiology received less attention than the earlier (Feb. 1963) paper in Nature? I think Nature's high Journal Impact Factor (JIF) is the most plausible explanation for the difference. The JIF continues to be used to assess the quality of individual publications, even though it has been questioned whether its use has been good for science. See, for example: The Journal Impact Factor: a brief history, critique, and discussion of adverse effects, arXiv preprint [arXiv:1801.08992, 2018, PDF]. It should be noted that Eugene Garfield first mentioned the idea of an impact factor in 1955, but the Science Citation Index was first published in 1961. See: The History and Meaning of the Journal Impact Factor, Garfield E, JAMA [January 4, 2006, Vol 295, No. 1 (PDF, Reprinted)]. Garfield responded to critics of the JIF in that article. It is noteworthy that our papers were published at a time when the JIF was just beginning to be developed. It didn't have the cachet then that it has now.

Our early work related to stem cells of the murine blood-forming system was reported in three papers. The first was published in Radiation Research [February 1961, Vol. 14, No. 2, pp. 213-222] and entitled: A direct measurement of the radiation sensitivity of normal mouse bone marrow cells. The full text of this paper is openly accessible [PDF] via the TSpace repository at the University of Toronto Libraries, at: http://hdl.handle.net/1807/2781. The second was the paper in Nature [1963(2 Feb); 197(4866): 452-454] referred to above. The third was the paper in the Journal of Cellular and Comparative Physiology [1963(Dec); 62(3): 327-336] also referred to above. These three papers form a package. As noted above, it was only in the third, in 1963, that we began to use the term "stem cells".

Although I was somewhat taken aback by Paul Knoepfler's apparent unawareness of the full package of three papers described above, I laud him for his exemplary accomplishments, especially as a blogger and advocate. In particular, I'd point to his efforts as a voice of caution in relation to unproven stem cell treatments. See, for example: The rise of unproven stem cell therapies turned this obscure scientist into an industry watchdog, by Kelley Servick, Science Aug. 3, 2017.

Monday, December 30, 2019

Examples of Research on CSCs during 2019

There's a section on Cancer Stem Cells in Nature.
The current subsection on Latest Research and Reviews lists articles published in December 2019. Some of these are openly accessible and some are not.
Another current subsection, on News and Comment, lists more articles published during 2019.
All of these articles have been published in Nature Research journals. Nature Research is part of Springer Nature.

Monday, December 31, 2018

A Decade on Twitter

I joined Twitter in August of 2008. My emphasis has been mainly on #cancerSC and #OpenAccess. For several years, I've used Twitter, rather than this blog, as a means for providing information about selected recent research on cancer stem cells. How much attention has been paid to these tweets?

I currently have 965 followers on Twitter. The recently-acquired followers appear to be either scientists, or people who have no obvious interest in either #cancerSC or #OpenAccess. Is 965 a satisfactory number of followers? Neither of the main foci of my Twitter posts can be regarded as mainstream topics, so my modest target has been 1000 followers. The total is almost there (after a decade!). A substantial proportion of those followers who identify themselves as scientists mention an interest in stem cells and/or regenerative medicine. Fewer reveal any interest in Open Access. I cannot provide any useful quantitative information about the individual interests of all 965 followers, because of major uncertainties about how best to classify their interests.

Perhaps the attention paid to individual tweets might provide a quantitative assessment of their impact? If so, the results are sobering. Over the past year, no tweets hashtagged #cancerSC have earned "retweets" or "likes" beyond the single digits. (A few tweets hashtagged #OpenAccess have accumulated retweets and likes beyond the single digits, but the most popular ones have been ones that I retweeted, rather than ones that I posted myself.)

How to account for these rather unimpressive statistics? Several explanations come to mind.

One is that many people who become followers on Twitter don't actually look at many tweets. Also, perhaps those who do look seldom retweet them (or indicate that they like them).

Another explanation for the apparent lack of attention is that most of the tweets are indeed uninteresting (or far too specialized) except perhaps to a very few who do find them useful.

There are other possible explanations, but perhaps it's time to ask another question. If the individual tweets have little impact, as measured by retweets and likes, why post them?

I have two main answers to this question. Firstly, these posts provide an incentive for me to keep up with current publications about cancer stem cells. The literature on all of stem cell research is too extensive for one person to manage, but a subset of particular interest isn't. Secondly, the total number of followers (965) isn't, to my mind, a trivial number (see also above).

One last question. Are most of the followers still there because they haven't bothered to unfollow, or because they occasionally find some tweets to be of interest, or because they are interested in the author of the tweets? No relevant data. Don't know.

Best wishes for 2019!

Sunday, December 31, 2017

Tweets about cancer stem cells v2

This is Version 2 of a previous post dated September 5, 2014.

I've had a long-term interest in research on cancer in general, and cancer stem cells (CSCs) in particular. See, for example, "A stem cell model of human tumor growth: implications for tumor cell clonogenic assays", J Natl Cancer Inst. 1983 Jan;70(1):9-16 [PubMed]. I've been trying to keep up with the current literature about CSCs, and have found the task to be a challenging one.

Effective ways to filter the voluminous academic literature are badly needed. Social media have provided a possible route to this goal. I've been exploring a few such media, and especially Twitter.

I've been a member of Twitter since December 2008. I've posted over 4,500 tweets since then. Almost all of them have been about either CSCs or open access (OA).

My tweets about CSCs have included the hashtag #cancerSC. I usually post about 5-10 tweets with this hashtag per month. Previous tweets can be accessed by searching within Twitter for the #cancerSC hashtag.

As sources of information for recent news and publications about CSCs, I've used the following:

a) PubMed searches for "cancer stem", with the results sent via PubMed RSS to the RSS reader Feedly. My main focus is on articles published within the last month. PubMed is my main source of relevant information. (A scripted sketch of this kind of search appears after this list.)

b) Google Alerts, to monitor the web for interesting new content about the keywords "cancer stem".

c) Occasionally, other contributors to Twitter.
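
I do the searching in item (a) via PubMed RSS and Feedly, but much the same job can be scripted. Below is a minimal sketch, assuming the publicly documented NCBI E-utilities endpoints (esearch and esummary); the search term and date window match my usage, while the helper names are mine, invented for illustration.

```python
# A minimal sketch (not my actual RSS/Feedly workflow): query NCBI E-utilities
# for "cancer stem" articles published within roughly the last month.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def recent_pubmed_ids(term="cancer stem", days=30, max_results=50):
    """Return PubMed IDs for articles matching `term`, published in the last `days` days."""
    params = {
        "db": "pubmed",
        "term": term,
        "reldate": days,      # restrict to the last `days` days
        "datetype": "pdat",   # filter on publication date
        "retmax": max_results,
        "retmode": "json",
    }
    r = requests.get(f"{EUTILS}/esearch.fcgi", params=params, timeout=30)
    r.raise_for_status()
    return r.json()["esearchresult"]["idlist"]

def summaries(pmids):
    """Yield (PubMed ID, title, DOI-or-None) for a list of PubMed IDs."""
    if not pmids:
        return
    params = {"db": "pubmed", "id": ",".join(pmids), "retmode": "json"}
    r = requests.get(f"{EUTILS}/esummary.fcgi", params=params, timeout=30)
    r.raise_for_status()
    result = r.json()["result"]
    for pmid in result["uids"]:
        rec = result[pmid]
        doi = next((x["value"] for x in rec.get("articleids", []) if x["idtype"] == "doi"), None)
        yield pmid, rec.get("title", ""), doi

if __name__ == "__main__":
    for pmid, title, doi in summaries(recent_pubmed_ids()):
        print(pmid, doi or "(no DOI)", title)
```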

These sources (especially PubMed) provide a cornucopia of information about what's new in stem cell research and development. My major challenge has been an editorial one: which aspects of all this information should be selected and tweeted about?

Screening Step 1: A useful screening tool has been the Altmetric Bookmarklet. At present, this Bookmarklet only works on PubMed, the arXiv repository, or pages containing a digital object identifier (DOI). Twitter mentions (noted by Altmetric) are only available for articles published since July 2011.

Using the bookmarklet, I screen the results sent by the PubMed RSS, and select for further examination those articles that have non-zero article-level metrics. If Altmetric has picked up sharing activity around an article, I proceed to Screening Step 2. (For anyone not familiar with Altmetric.com, it's a site that provides assessments of article-level metrics, or altmetrics. The Altmetric score is now called the Altmetric Attention Score.)
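
For readers who would rather script this step than click a bookmarklet, here is a rough sketch, assuming Altmetric's free v1 API (which, as I understand it, returns a 404 when no attention has been recorded for an article). The field name checked at the end is my assumption about what the API exposes, and the function names are mine, for illustration only.

```python
# A rough scripted analogue of Screening Step 1 (I normally use the Bookmarklet by hand).
# Assumes Altmetric's free v1 API; a 404 response is taken to mean that Altmetric has
# no record of any attention for the article, i.e. an attention score of zero.
import requests

def altmetric_attention(pmid):
    """Return Altmetric's record for a PubMed ID, or None if no attention was found."""
    r = requests.get(f"https://api.altmetric.com/v1/pmid/{pmid}", timeout=30)
    if r.status_code == 404:
        return None                      # no sharing activity picked up by Altmetric
    r.raise_for_status()
    return r.json()

def passes_step_1(pmid):
    """True if the article has a non-zero Altmetric Attention Score."""
    record = altmetric_attention(pmid)
    if record is None:
        return False
    return record.get("score", 0) > 0    # "score" is my assumption about the field name
```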

Screening Step 2: The next screening step is to subject the title of each article to a Twitter Search, which allows one to search for tweets that have included this title. If such a search reveals at least two tweets about the article, I go on to Screening Step 3. I currently do a Twitter Search only if the article has a non-zero Altmetric score. My experience has been that it's extremely rare for articles with an Altmetric score of zero to yield any tweets, as assessed by a Twitter Search.
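
A scripted version of this step is also possible, although I do it by hand on the Twitter website. The sketch below assumes Twitter's v2 recent-search endpoint and a developer bearer token (neither is needed for the manual search); the function name and the two-tweet threshold are simply illustrations of the rule described above.

```python
# A rough scripted analogue of Screening Step 2 (I normally just use Twitter's web search).
# Assumes Twitter API v2 recent search and a developer bearer token in TWITTER_BEARER_TOKEN.
import os
import requests

def enough_tweets_for_title(title, min_tweets=2):
    """Return True if at least `min_tweets` recent tweets quote the article title."""
    headers = {"Authorization": f"Bearer {os.environ['TWITTER_BEARER_TOKEN']}"}
    params = {"query": f'"{title}"', "max_results": 10}   # exact-phrase search on the title
    r = requests.get("https://api.twitter.com/2/tweets/search/recent",
                     headers=headers, params=params, timeout=30)
    r.raise_for_status()
    count = r.json().get("meta", {}).get("result_count", 0)
    return count >= min_tweets
```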

Screening Step 3: I'm a supporter of Open Access. So, I next check whether or not the article is freely accessible (no paywalls). If there are no paywalls, I prepare a tweet about the article. If I do run into a paywall, I prepare a tweet only if the Altmetric Attention Score, the results from a Twitter Search, or my own reading of the article yields a very positive impression. In that case, I indicate in the tweet that the article is not OA by putting ($) after the title of the article.
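
I check for paywalls by hand, but for completeness here is one way the check could be scripted, assuming the Unpaywall REST API (which asks callers to supply an email address). The helper names and the placeholder email are mine, for illustration only.

```python
# One programmatic way to check for paywalls (not what I actually do, which is manual).
# Assumes the Unpaywall REST API; replace the placeholder email with your own.
import requests

def is_open_access(doi, email="you@example.com"):
    """Return True if Unpaywall reports a freely accessible copy of the article."""
    r = requests.get(f"https://api.unpaywall.org/v2/{doi}",
                     params={"email": email}, timeout=30)
    r.raise_for_status()
    return bool(r.json().get("is_oa", False))

def tweet_suffix(doi):
    """Append "($)" after the title for paywalled articles, per the convention above."""
    return "" if is_open_access(doi) else " ($)"
```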

Some users of Twitter focus their attention on the literature related to a particular topic. One example is Hypoxia Adaptation, "A feed for hypoxia related papers published in NCBI, ArXiv, bioArxiv, and PeerJ". Another is epigenetics_papers, "Chromatin & epigenetics paper feed from #Pubmed and #Arxiv". It's unclear what criteria (other than the topic of interest) are used as the basis for tweets from these users. So, I'm currently discounting such tweets, in comparison with others that do not originate from feeds such as these.

The target audience for my tweets is anyone interested in current research on CSCs; the tweets are not aimed only at those active in CSC research. Hence the somewhat higher priority given to articles that have no paywalls. It should be noted that only a very small percentage of articles (less than 5%) reach Screening Step 3.

Of course, there's no way to avoid some subjectivity in an editorial process of this kind. So, I occasionally ignore the results of the screening process and tweet about articles that I especially like. And, no doubt, some interesting articles will be missed. The greater the sensitivity and specificity of the screening process, the more likely it is that the relevant articles will be found and the irrelevant ones rejected.

For an example of a positive view about tweets, see: Can tweets predict citations? Metrics of social impact based on twitter and correlation with traditional metrics of scientific impact by Gunther Eysenbach (2011).

Examples of positive views about altmetrics are: Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact by Jason Priem, Heather Piwowar & Bradley Hemminger (2012) and Value all research products by Heather Piwowar (2013).

I'm aware of criticisms of a screening process which relies heavily on altmetrics and tweets. For examples of such criticisms, see: Twitter buzz about papers does not mean citations later by Richard Van Noorden (2013), Why you should ignore altmetrics and other bibliometric nightmares by David Colquhoun & Andrew Plested (2014) and Weaknesses of Altmetrics (undated, and authors not identified).

My own view is that tweets and altmetrics merit further exploration, as indicators of "attention". Of course, one needs to watch out for "gaming" (see: Gaming altmetrics). However, my own examination of tweets and altmetrics related to CSCs has yielded little evidence of gaming. Instead, the tweets I've seen (note that the coverage of all the altmetrics except for Twitter seems to be low) almost always appear to be the result of authentic-looking attention from real people. Occasionally, I've seen some evidence of gaming, but such articles haven't survived the screening procedure.

I do not believe that Impact Factors should be regarded as the unquestioned gold standard for indicators used to assess impact (see, for example, Impact Factors: A Broken System by Carly Strasser, 2013). Of course, the gold standard for oneself is one's own opinion upon reading a publication. But, no one can read everything.

An article, How to tame the flood of literature by Elizabeth Gibney in Nature (03 September 2014), provides comments about emerging literature-recommendation engines. I haven't yet tested all of these, but they do clearly merit attention.

I'd be very grateful for any suggestions about ways to improve the efficiency, sensitivity and specificity of a screening process of the kind outlined in this post.

Thursday, February 4, 2016

Canadian cancer stem cell Dream Team announced

A headline in today's Globe and Mail: "Canadian ‘dream team’ to probe stem-cell link to brain cancer". The article, by Ivan Semeniuk (@ivansemeniuk), is available online here.

Monday, August 31, 2015

The #cancerSC hashtag on Twitter

From the Editor: I switched some time ago to the use of Twitter, instead of this blog, as a place to post items about selected recent news or research reports related to cancer stem cells. See: https://twitter.com/hashtag/cancerSC

For a summary of the methods used to select items to be identified using the #cancerSC hashtag, see: Tweets about cancer stem cells v2.

Wednesday, October 22, 2014

SU2C Canada Cancer Stem Cell Dream Team Research Funding


Description (see: http://www.aacrcanada.ca/pages/stemcell.aspx)

The Stand Up To Cancer (SU2C) Canada Cancer Stem Cell Dream Team Research Funding represents a new, focused effort to implement advances in Cancer Stem Cell research as rapidly as possible through the creation of a collaborative, translational cancer research "Dream Team." The most talented and promising researchers across Canadian institutions will be assembled into a pan-Canadian Dream Team, forming an optimal configuration of expertise needed to solve key problems in Cancer Stem Cells and positively impact patients in the near future. This Dream Team will span multiple disciplines and utilize the new tools of modern biology, with an emphasis on genomics, to attack research questions in a coordinated way. The Dream Team will have mechanisms for sharing of resources and platforms (knowledge, talent, tools, technologies, etc.) across the Team including existing platforms and resources as well as those to be developed, incorporating new methods and technologies into the research groups, and training and networking across the Dream Team. Mechanisms to foster collaborations within and among the Dream Teams will be employed, an approach that promotes the sharing of information and a goal-oriented focus on measurable milestones of progress.
There are currently $10.6 million CAD available for this SU2C Canada Cancer Stem Cell Dream Team, with funds from the CSCC (through Genome Canada and CIHR) and SU2C Canada. The SU2C Canada Cancer Stem Cell Dream Team will be funded for a four-year term.
In addition to the funds available from the CSCC (through Genome Canada and CIHR) and SU2C Canada, the Ontario Institute for Cancer Research (OICR) has made available up to $3 million of supplemental funds to support clinical trial activities should the successful proposal include clinical trial activities within the province of Ontario (please see "SU2C Canada-OICR Cancer Clinical Trials: Canadian Dream Team Supplementary Funding" document which can be downloaded from proposalCENTRAL). It should be noted that clinical trial activities in any region of the country can be supported by the $10.6 million available from Genome Canada, CIHR, and SU2C Canada. Continued efforts to partner and collaborate with other organizations to support the objectives of the SU2C Canada Cancer Stem Cell Dream Team are ongoing. As other partnerships are confirmed the relevant information will be communicated to potential applicants. Applicants are also encouraged to explore potential partnering and collaboration opportunities.

The SU2C website is at: http://www.standup2cancer.org/

SU2C on Twitter is at: https://twitter.com/SU2C