The Open Tree of Life project is a collaborative effort to synthesize, share, and update a comprehensive phylogeny of all 2.3 million named species. We have completed a draft synthesis of a single tree from hundreds of phylogenetic estimates, using taxonomy as a scaffold. This synthesis is not static; rather, it will be continually revised as new data become available. This undertaking requires the development of both novel infrastructure and analysis tools. I will discuss three components of this project: Phylesystem, an open database and web application for community curation of phylogenies using a git-based datastore; PhyScraper, a pipeline to continually update phylogenetic estimates as new data are generated; and FastDate, an algorithm to rapidly generate maximum a posteriori estimates of time-calibrated trees, even for phylogenies with hundreds or thousands of tips. Together, these developments reduce impediments to accessing, analyzing, and reusing the phylogenetic information that is essential to biological research today.
-
The bacterial genome is classically divided into essential, stable, slowly evolving replicons (chromosomes) and accessory, mobile, rapidly evolving replicons (plasmids). This paradigm has been questioned since the discovery of genomic elements that possess both chromosomal and plasmidic features. These Extra-Chromosomal Essential Replicons (ECERs), whether called "megaplasmids", "secondary chromosomes", or "chromids", are found in diverse lineages across the bacterial phylogeny and are generally believed to be modified plasmids. However, their true nature, and the mechanisms permitting their integration into the stable genome, have yet to be formally determined. We explored the relationships between replicons, with reference to their Genetic Information Inheritance Systems (GIIS), under the assumption that the inheritance of ECERs is integrated into the cell cycle and highly constrained, in contrast to that of standard plasmids. A global comparative genomics analysis, including all available complete bacterial genome sequences, was performed using GIIS functional homologues as parameters and applying several analytical procedures. GIIS proved appropriate for characterizing both the replicons' level of integration within the stable genome and their origins. The study of ECERs thus provides clues to the genetic mechanisms and evolutionary processes involved in the stabilization of replicons into the essential genome, and to the continuity of the genomic material.
-
Human tissues constantly replace dying cells with newborn cells. The pace of replacement, however, varies by orders of magnitude, from blood cells, which are renewed every day, to neurons, for which renewal is non-existent or limited to specific regions of the brain. Between these extremes lie many tissues that turn over on a time scale of years, although no direct measurements have been made. We present here a mathematical method to estimate cell turnover in slowly renewing biological systems. The age distribution of DNA can be estimated from the integration of radiocarbon derived from nuclear bomb testing during the Cold War (1955-1963). For slowly renewing tissues, this method provides a better estimate of the average age of the tissue than direct estimates from the bomb curve. Moreover, death, birth, and turnover rates can be estimated. We illustrate the method with data from hippocampal neurons and cardiomyocytes.
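The inversion described above (inferring a turnover rate from the average radiocarbon content of DNA) can be sketched in a toy model. This is a minimal illustration, not the authors' method: the atmospheric curve `atm` is a made-up stand-in for the real Delta-14C record, and a constant per-cell replacement rate (exponential age distribution, truncated at the subject's age) is an assumption.

```python
import numpy as np

def atm(year):
    """Toy atmospheric 14C curve: rises 1955-1963, then decays (illustrative)."""
    return np.where(year < 1955, 0.0,
                    np.where(year < 1963, (year - 1955) / 8.0,
                             np.exp(-(year - 1963) / 16.0)))

def expected_signal(lam, birth_year, sample_year, n=2000):
    """Average 14C of DNA sampled in `sample_year`, for turnover rate `lam`.

    DNA strand ages follow an exponential(lam) distribution truncated at the
    subject's age; a strand of age a carries the atmospheric level of the
    year it was synthesized, sample_year - a.
    """
    max_age = sample_year - birth_year
    a = np.linspace(0.0, max_age, n)
    da = max_age / (n - 1)
    w = lam * np.exp(-lam * a) * da          # continuous part of age density
    w[-1] += np.exp(-lam * max_age)          # point mass: never-replaced DNA
    w /= w.sum()
    return float(np.sum(w * atm(sample_year - a)))

def estimate_turnover(signal, birth_year, sample_year):
    """Invert expected_signal by grid search over candidate turnover rates."""
    grid = np.linspace(0.01, 2.0, 400)
    errs = [abs(expected_signal(l, birth_year, sample_year) - signal)
            for l in grid]
    return float(grid[int(np.argmin(errs))])

# Round-trip check: a tissue with a 5-year mean turnover time (lam = 0.2)
# in a subject born after the bomb peak, sampled in 2010.
true_lam = 0.2
obs = expected_signal(true_lam, birth_year=1985, sample_year=2010)
est = estimate_turnover(obs, birth_year=1985, sample_year=2010)
```

For a subject born after the bomb peak the atmospheric curve is monotone over the DNA-age window, so the signal-to-rate mapping is invertible; the abstract's point is that for slowly renewing tissues this model-based inversion outperforms reading the bomb curve directly.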
Stroke is a major cause of acquired disability in adults. Biotherapies have shown a beneficial effect on neuronal plasticity (synaptogenesis, angiogenesis, dendritic regrowth, and immunomodulation) and on neurological recovery in experimental ischemia. According to the criteria of the STEP consortium on biotherapies in stroke, demonstrating the effect of mesenchymal stem cells (MSCs) requires several steps: 1) larger numbers of patients in studies; 2) a better understanding of the mechanisms of cellular regeneration and neuronal reorganization in humans, in order to identify the patients likely to respond to treatment; 3) a quantitative evaluation of the effect of MSCs through the introduction of neuroimaging biomarkers, in particular functional imaging and the measurement of white-matter integrity with diffusion imaging. The HERMES project sought to identify MRI biomarkers within the ISIS clinical trial (PHRC2007, Grenoble University Hospital), which included 31 patients who had had a stroke less than 2 weeks earlier. The brain-imaging protocol made it possible to model motor recovery as a function of functional MRI of motor-network activation during a motor task (fMRI), structural connectivity from diffusion tensor imaging (DTI), and resting-state functional connectivity (rs-fMRI), together with cerebral perfusion and cerebrovascular reserve under CO2. These imaging biomarkers are also of interest for evaluating the effect of treatment on motor recovery at 6 months. Validation of these neuroimaging biomarkers will allow them to be tested in the European multicentre clinical trial (RESSTORE) assessing the effect of allogeneic stem cells in subacute stroke, which will include 400 patients.
Contact: Mme Mariethé CHAUMEIL. Registration free but mandatory before Monday 5 October 2015. Email: mariethe.chaumeil@chu-lyon.fr
The Rhône registry, which this year celebrates its 20th anniversary, records all casualties of injury road-traffic accidents occurring in the Rhône département. For each casualty who passed through hospital care, a complete injury profile is known, coded with the AIS (Abbreviated Injury Scale) classification. The aim of this work is to describe these injury profiles, and in particular the preferential associations between different injuries. We will also address whether these associations are specific to road-user type (typically car occupant, motorcyclist, cyclist, or pedestrian). After recalling the general principles of binary graphical models and penalized estimation, we will show how the choice of a suitable penalty makes it possible to answer all of these questions on the Rhône registry data.
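One standard way to estimate a binary graphical model with an L1 penalty, as discussed above, is node-wise lasso-penalized logistic regression (neighbourhood selection). The sketch below is an illustrative stand-in, not the registry's actual model or penalty: the "lesion" indicators are simulated, not AIS data.

```python
import numpy as np

def lasso_logistic(X, y, lam, lr=0.1, iters=3000):
    """L1-penalized logistic regression via proximal gradient (ISTA).

    Column 0 of X is the intercept and is left unpenalized.
    """
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(iters):
        prob = 1.0 / (1.0 + np.exp(-(X @ beta)))
        beta -= lr * (X.T @ (prob - y)) / n        # gradient step
        # soft-threshold every coefficient except the intercept
        beta[1:] = np.sign(beta[1:]) * np.maximum(np.abs(beta[1:]) - lr * lam, 0.0)
    return beta

# Simulate three binary "lesion" indicators: lesion 1 strongly co-occurs
# with lesion 0 (a genuine edge); lesion 2 is independent (no edge).
rng = np.random.default_rng(0)
n = 1000
les0 = (rng.random(n) < 0.4).astype(float)
les1 = np.where(rng.random(n) < 0.85, les0, 1.0 - les0)
les2 = (rng.random(n) < 0.4).astype(float)

# Neighbourhood of node 1: regress lesion 1 on the other lesions; the
# nonzero coefficients are the estimated edges of that node.
X = np.column_stack([np.ones(n), les0, les2])
beta = lasso_logistic(X, les1, lam=0.05)
```

Repeating this regression for each lesion and symmetrizing the resulting edge sets yields the full graph; the choice of penalty (here a plain lasso) is exactly the lever the talk proposes to adapt, e.g. to compare graphs across road-user types.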
-
Longitudinal studies of disease progression and treatment increasingly involve time-varying treatments. Many such treatments may have cumulative effects, where the risk of the outcome depends not only on the current or most recent treatment status or dose but also on the history of past treatment. One important analytical challenge in such studies is the need to specify an 'etiologically correct' exposure metric that summarizes the impact of treatment/exposure history on the current hazard. Flexible modeling of a weighted cumulative exposure (WCE), in which the exposure metric is defined as a weighted sum of past treatments, has been proposed to address this challenge, and the WCE model has been shown to include conventional, simpler exposure models as special cases. Another important challenge in assessing the causal effects of time-varying treatments arises when the treatment both affects (future) and depends on (past) values of a time-varying risk factor. Such a risk factor then acts as both a confounder and a mediator of the estimated treatment effect. Marginal structural models (MSMs) have been developed, and demonstrated, to provide unbiased treatment-effect estimates in the presence of such time-varying confounders/mediators. We propose, and validate in simulations, a new flexible model that combines the MSM and WCE methodologies. The new model is a flexible extension of the weighted Cox MSM with inverse-probability-of-treatment (IPT) weights. To estimate the cumulative effect of past treatments, we use cubic regression splines to estimate the marginal weight function, which represents the relative importance assigned to past exposures depending on the time elapsed since exposure. The new WCE model is implemented by inserting artificial time-dependent (TD) covariates into the Cox model. Stabilized TD IPT weights are employed to control for TD confounders/mediators of the treatment effect.
Simulations demonstrate that our MSM WCE estimates accurately capture the total causal effect of time-varying treatments, i.e. the sum of (i) their direct effect on the hazard and (ii) their indirect effect, mediated through changes in the TD confounder/mediator. Furthermore, if the indirect effect is moderate or strong, the estimated marginal cumulative treatment effect may be substantially stronger than the estimate from the conventional (unweighted) 'conditional' WCE model. Reference: Xiao Y, Abrahamowicz M, Moodie EEM, Weber R, Young J. Flexible marginal structural models for estimating the cumulative effect of a time-dependent treatment on the hazard: reassessing the cardiovascular risks of didanosine treatment in the Swiss HIV Cohort Study. Journal of the American Statistical Association, 2014. DOI: 10.1080/01621459.2013.872650
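The artificial time-dependent covariate trick mentioned above can be sketched numerically: once the weight function w() is expanded in a basis, the weighted sum of past doses becomes a linear combination of precomputable covariates D_j(t), which is what gets inserted into the Cox model. This is an illustrative sketch, not the paper's code: the window length, dose history, and weight function are assumptions, and a plain cubic polynomial basis stands in for the cubic regression splines.

```python
import numpy as np

WINDOW = 20  # assumed: weights vanish for exposures older than 20 time units

def w(u):
    """Example weight function lying in the basis span: a cubic decay."""
    return np.where(u <= WINDOW, (1 - u / WINDOW) ** 3, 0.0)

# Basis for cubics on [0, WINDOW]: 1, u, u^2, u^3 (a spline basis with
# interior knots would simply add truncated-power terms).
basis = [lambda u: np.ones_like(u),
         lambda u: u,
         lambda u: u ** 2,
         lambda u: u ** 3]
# Coefficients of (1 - u/WINDOW)^3 in that basis
theta = np.array([1.0, -3 / WINDOW, 3 / WINDOW ** 2, -1 / WINDOW ** 3])

rng = np.random.default_rng(1)
dose = rng.integers(0, 2, size=60).astype(float)  # 0/1 doses at t = 0..59

def wce_direct(t):
    """WCE(t) = sum over past times u <= t of dose(u) * w(t - u)."""
    u = np.arange(t + 1)
    return float(np.sum(dose[u] * w(t - u)))

def wce_via_covariates(t):
    """Same quantity via artificial covariates D_j(t) = sum_u dose(u) B_j(t-u)."""
    u = np.arange(t + 1)
    lag = (t - u).astype(float)
    inside = lag <= WINDOW                     # weights are zero past the window
    D = np.array([np.sum(dose[u][inside] * b(lag[inside])) for b in basis])
    return float(theta @ D)
```

In the real model theta is not known but estimated: the D_j(t) enter the (IPT-weighted) Cox partial likelihood as ordinary time-dependent covariates, and their fitted coefficients reconstruct the weight function.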
-
Several protein isoforms can be produced from a single gene through alternative splicing. These isoforms have different protein sequences and often diverse, or even antagonistic, functions. In recent years, the use of high-throughput technologies has revealed that alternative splicing is massively deregulated in many experimental conditions. A proportion of the splicing events observed at the transcript level are also observed at the protein level. However, it is still difficult to decipher the functional consequences of these splicing variations because of the lack of functional information at the exon level. To circumvent this problem, we introduce a computational strategy that relies on the functional annotation of exons to predict the consequences of their inclusion or skipping.
For lack of dedicated tools, ordinal data are often treated either as nominal data, discarding the notion of order, or as quantitative data, artificially introducing a notion of distance between categories. To avoid both of these extreme solutions, we propose a new probability distribution for ordinal data, parameterized by a position parameter and a precision parameter. This distribution is then used to define a clustering algorithm specific to ordinal data that accommodates multivariate and potentially missing data. The algorithm has been implemented in software (an R package and a SaaS web application), which will be used for a demonstration on real data.
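To make the position/precision idea concrete, here is a toy ordinal distribution with a modal category mu and a spread parameter pi in (0, 1), where probability decays geometrically with distance from mu (smaller pi means a sharper, more "precise" distribution). This is a simple stand-in chosen for illustration, not the distribution proposed in the talk, whose exact form is not given in the abstract.

```python
import numpy as np

def pmf(m, mu, pi):
    """P(X = x) for x in 1..m, proportional to pi ** |x - mu|."""
    x = np.arange(1, m + 1)
    p = pi ** np.abs(x - mu)
    return p / p.sum()

def fit(data, m):
    """Maximum-likelihood fit of (mu, pi) by grid search."""
    best = None
    for mu in range(1, m + 1):
        for pi in np.linspace(0.05, 0.95, 19):
            ll = np.log(pmf(m, mu, pi)[data - 1]).sum()
            if best is None or ll > best[0]:
                best = (ll, mu, pi)
    return best[1], best[2]

# Simulate 500 responses on a 5-point scale and recover the parameters.
rng = np.random.default_rng(0)
m, true_mu, true_pi = 5, 4, 0.4
sample = rng.choice(np.arange(1, m + 1), size=500, p=pmf(m, true_mu, true_pi))
mu_hat, pi_hat = fit(sample, m)
```

A model-based clustering algorithm of the kind described would embed such a per-cluster distribution in a mixture, alternating between assigning observations to clusters and refitting each cluster's position and precision; missing coordinates drop out of the per-observation likelihood.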