Wednesday, October 22, 2014

SU2C Canada Cancer Stem Cell Dream Team Research Funding

Description:

The Stand Up To Cancer (SU2C) Canada Cancer Stem Cell Dream Team Research Funding represents a new, focused effort to implement advances in Cancer Stem Cell research as rapidly as possible through the creation of a collaborative, translational cancer research "Dream Team." The most talented and promising researchers across Canadian institutions will be assembled into a pan-Canadian Dream Team, forming an optimal configuration of the expertise needed to solve key problems in Cancer Stem Cells and positively impact patients in the near future. This Dream Team will span multiple disciplines and utilize the new tools of modern biology, with an emphasis on genomics, to attack research questions in a coordinated way. The Dream Team will have mechanisms for sharing resources and platforms (knowledge, talent, tools, technologies, etc.) across the Team, including both existing platforms and resources and those to be developed; for incorporating new methods and technologies into the research groups; and for training and networking across the Dream Team. Mechanisms to foster collaboration within and among the Dream Teams will also be employed, an approach that promotes the sharing of information and a goal-oriented focus on measurable milestones of progress.
There are currently $10.6 million CAD available for this SU2C Canada Cancer Stem Cell Dream Team, with funds from the CSCC (through Genome Canada and CIHR) and SU2C Canada. The SU2C Canada Cancer Stem Cell Dream Team will be funded for a four-year term.
In addition to the funds available from the CSCC (through Genome Canada and CIHR) and SU2C Canada, the Ontario Institute for Cancer Research (OICR) has made available up to $3 million of supplemental funds to support clinical trial activities should the successful proposal include clinical trial activities within the province of Ontario (please see "SU2C Canada-OICR Cancer Clinical Trials: Canadian Dream Team Supplementary Funding" document which can be downloaded from proposalCENTRAL). It should be noted that clinical trial activities in any region of the country can be supported by the $10.6 million available from Genome Canada, CIHR, and SU2C Canada. Continued efforts to partner and collaborate with other organizations to support the objectives of the SU2C Canada Cancer Stem Cell Dream Team are ongoing. As other partnerships are confirmed the relevant information will be communicated to potential applicants. Applicants are also encouraged to explore potential partnering and collaboration opportunities.

The SU2C website is at:

SU2C on Twitter is at:

Tuesday, October 14, 2014

Stand Up To Cancer Canada Dream Teams

A tweet posted today included a link to a press release entitled: "SU2C Canada Announces Up To $22.6 Million CAD Available for Stand Up To Cancer Canada Dream Teams". Two excerpts:
Two Dream Team funding opportunities are available, one for translational research focused on breast cancer and the other on cancer stem cells:
The Stand Up To Cancer Canada Cancer Stem Cell Dream Team will provide approximately $10.6 million CAD over a four-year term, with funds from the CSCC (through Genome Canada and CIHR) and SU2C Canada. The Call for Ideas seeks to support a single, integrated, and cohesive pan-Canadian team bringing together key stakeholders – researchers, clinicians, industry, nongovernmental organizations, and funders – with the goal of improving the outcomes of hard-to-treat cancers by focusing on the role of cancer stem cells and stem cell programs in resistance and treatment failure in cancer. The team will employ new tools of modern biology, with an emphasis on genomics.
• Additionally, the two qualifying Dream Teams may each receive supplementary funds up to $3 million over four years from OICR, to support clinical trial activities in the province of Ontario.

Friday, September 5, 2014

Tweets about cancer stem cells

I've had a long-term interest in research on cancer in general, and cancer stem cells (CSCs) in particular. See, for example, "A stem cell model of human tumor growth: implications for tumor cell clonogenic assays", J Natl Cancer Inst. 1983 Jan;70(1):9-16 [PubMed]. I've been trying to keep up with the current literature about CSCs, and have found the task to be a challenging one.

Effective ways to filter the voluminous academic literature are badly needed. Social media have provided a possible route to this goal. I've been exploring a few such media, and especially Twitter.

I've been a member of Twitter since December 2008. I've posted over 3000 tweets since then. Almost all of them have been about either CSCs or open access (OA).

My tweets about CSCs have included the hashtag #cancerSC. I usually post about 15-25 tweets with this hashtag per month. Previous tweets can be accessed by searching within Twitter for the #cancerSC hashtag. A Google search for the same hashtag will provide access to the same archive of tweets.

As sources of information for recent news and publications about CSCs, I've used the following:

a) Google Alerts, to monitor the web for interesting new content about the keywords "cancer stem".

b) PubMed searches for "cancer stem", with the results sent via PubMed RSS to the RSS reader Feedly. My main focus is on articles published within the last month. PubMed is my main source of relevant information.

c) Other contributors to Twitter, such as @cancerscnews.
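The PubMed RSS route (source b above) can be handled programmatically as well as through a reader like Feedly. A minimal sketch, assuming a standard RSS 2.0 feed of the kind PubMed generates for a saved "cancer stem" search (the sample feed and its PubMed link below are illustrative placeholders, not real records):

```python
import xml.etree.ElementTree as ET

def parse_pubmed_rss(xml_text):
    """Extract (title, link) pairs from an RSS 2.0 feed, e.g. a PubMed search feed."""
    root = ET.fromstring(xml_text)
    results = []
    for item in root.iter("item"):
        title = item.findtext("title", default="").strip()
        link = item.findtext("link", default="").strip()
        results.append((title, link))
    return results

# Illustrative sample feed (placeholder title and link)
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>PubMed: cancer stem</title>
  <item>
    <title>Example article on cancer stem cells</title>
    <link>https://pubmed.ncbi.nlm.nih.gov/00000000/</link>
  </item>
</channel></rss>"""

for title, link in parse_pubmed_rss(SAMPLE_FEED):
    print(title, link)
```

In practice one would fetch the live feed URL that PubMed provides for the saved search, then feed each new item into the screening steps described below.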

These sources (especially PubMed) provide a cornucopia of information about what's new in stem cell research and development. My major challenge has been an editorial one: which aspects of all this information should be selected and tweeted about?

Screening Step 1: A useful screening tool has been the Altmetric Bookmarklet. At present, this Bookmarklet only works on PubMed, the arXiv repository, or pages containing a digital object identifier (DOI).

Using the bookmarklet, I screen the results sent by the PubMed RSS feed, and select for further examination those articles that have non-zero article-level metrics. Whether or not Altmetric has picked up sharing activity around an article, I proceed to Screening Step 2. (For those not familiar with Altmetric, it's a start-up that attempts to assess article-level metrics, or altmetrics.)
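Altmetric also exposes a public REST API that reports the same article-level data the bookmarklet displays, so this step could be automated. A sketch, assuming the v1 DOI endpoint and its top-level "score" field (the endpoint returns no data when Altmetric has no record of the article):

```python
def altmetric_url(doi):
    """Build the public Altmetric v1 lookup URL for a DOI."""
    return "https://api.altmetric.com/v1/doi/" + doi

def has_nonzero_metrics(altmetric_json):
    """Screening Step 1: does the article have any article-level metrics?

    `altmetric_json` is the parsed API response, or None when Altmetric
    has no record of the article.
    """
    if altmetric_json is None:
        return False
    return altmetric_json.get("score", 0) > 0
```

Note that, as described above, an article with no Altmetric record still goes on to Screening Step 2; this check only flags which articles merit a closer look at this stage.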

Screening Step 2: The next screening step is to put the title of each article into Topsy, which allows one to search for tweets that have included the title. If a Topsy search reveals at least two tweets about the article, I go on to Screening Step 3. Sometimes, secondary sources (such as 7thSpace Interactive) are identified in multiple tweets. If so, I again go on to Screening Step 3.

Screening Step 3: I'm a supporter of the Open Access movement, so I next check whether the article is freely accessible (no paywalls). If there are no paywalls, I prepare a tweet about the article. If I do run into a paywall, I prepare a tweet only if the Altmetric data, the results from Topsy, or my own reading of the article yields a very positive impression. In that case, I indicate in the tweet that the article is "not OA".
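The three screening steps amount to a small decision rule. A minimal sketch, assuming each candidate article has already been annotated with a tweet count (Step 2), an open-access flag (Step 3), and a flag for an especially positive impression; the field names are illustrative, not from any real API:

```python
def should_tweet(article):
    """Decide whether an article survives the three screening steps.

    `article` is a dict with illustrative keys:
      tweet_count   - tweets found in Step 2
      is_oa         - True if the article is freely accessible (Step 3)
      strong_signal - True if the altmetrics, the tweets, or a personal
                      reading left a very positive impression
    """
    if article["tweet_count"] < 2:    # Step 2: need at least two tweets
        return False
    if article["is_oa"]:              # Step 3: no paywall, so tweet it
        return True
    return article["strong_signal"]   # paywalled: tweet only if very positive

def compose_tweet(title, is_oa):
    """Build a tweet with the #cancerSC hashtag, flagging paywalled articles."""
    suffix = "" if is_oa else " (not OA)"
    return f"{title}{suffix} #cancerSC"
```

This omits the occasional editorial override described below, where an especially interesting article is tweeted regardless of the screening results.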

The target audience for my tweets is anyone interested in current research on CSCs, not only those actively working on CSCs. Hence the somewhat higher priority given to articles that have no paywalls.

Of course, there's no way to avoid some subjectivity in an editorial process of this kind. So, I occasionally ignore the results of the screening process and tweet about articles that I especially liked. And, no doubt, some interesting articles will be missed. The greater the sensitivity and specificity of the screening process, the more likely it is that all of the relevant articles will be found and the irrelevant ones rejected.
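Sensitivity and specificity here carry their usual meanings from diagnostic testing, with "relevant article" as the positive class. A brief illustration with made-up counts:

```python
def sensitivity(true_pos, false_neg):
    """Fraction of truly relevant articles that the screen retains."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of irrelevant articles that the screen rejects."""
    return true_neg / (true_neg + false_pos)

# Hypothetical month: 10 relevant articles (8 kept, 2 missed),
# 100 irrelevant articles (90 rejected, 10 tweeted anyway).
print(sensitivity(8, 2))    # 0.8
print(specificity(90, 10))  # 0.9
```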

For an example of a positive view about tweets, see: Can tweets predict citations? Metrics of social impact based on twitter and correlation with traditional metrics of scientific impact by Gunther Eysenbach (2011).

Examples of positive views about altmetrics are: Altmetrics in the Wild: Using Social Media to Explore Scholarly Impact by Jason Priem, Heather Piwowar & Bradley Hemminger (2012); Value all research products by Heather Piwowar (2013).

I'm aware of criticisms of a screening process which relies heavily on altmetrics and tweets. For examples of such criticisms, see: Twitter buzz about papers does not mean citations later by Richard Van Noorden (2013); Why you should ignore altmetrics and other bibliometric nightmares by David Colquhoun & Andrew Plested (2014); Article-level metrics: An ill-conceived and meretricious idea by Jeffrey Beall (2013).

My own view is that tweets and altmetrics merit further exploration, as indicators of "attention". Of course, one needs to watch out for "gaming" (see: Gaming altmetrics). However, my own examination of tweets and altmetrics related to CSCs has yielded no unequivocal evidence of "gaming". Instead, the tweets I've seen (the coverage of all the altmetrics except for Twitter seems to be low) appear to be the result of authentic-looking attention from real people.

I do not believe that Impact Factors should be regarded as the unquestioned gold standard for indicators used to assess impact (see, for example, Impact Factors: A Broken System by Carly Strasser, 2013). Of course, the gold standard for oneself is one's own opinion upon reading a publication. But, no one can read everything.

An article, How to tame the flood of literature by Elizabeth Gibney in Nature (03 September 2014), provides comments about emerging literature-recommendation engines. I haven't yet used any of these, but they clearly merit attention.

I'd be very grateful for any suggestions about ways to improve the efficiency, sensitivity and specificity of a screening process of the kind outlined in this post.