Erik.rikard.daniel
Shared posts
Most paintings by Turner and Monet show stylistic evolution, not changes in pollution
SAAP-148 Eradicates MRSA Persisters Within Mature Biofilm Models Simulating Prosthetic Joint Infection
Prosthetic joint infection (PJI) is a severe complication of arthroplasty. Due to biofilm and persister formation, current treatment strategies often fail. Therefore, innovative anti-biofilm and anti-persister agents are urgently needed. Antimicrobial peptides, with their broad antibacterial activities, may be such candidates. In vitro models simulating PJI, comprising rifampicin/ciprofloxacin-exposed, mature methicillin-resistant Staphylococcus aureus (MRSA) biofilms on polystyrene plates, titanium/aluminium/niobium disks, and prosthetic joint liners, were developed. Bacteria obtained from and residing within these biofilms were exposed to SAAP-148, acyldepsipeptide-4 (ADEP4), LL-37, and pexiganan. Microcalorimetry was used to monitor the heat flow produced by the bacteria in these models. Daily exposure of mature biofilms to rifampicin/ciprofloxacin for 3 days resulted in a 4-log reduction of MRSA. Prolonged antibiotic exposure did not further reduce bacterial counts. Microcalorimetry confirmed the low metabolic activity of these persisters. SAAP-148 and pexiganan, but not LL-37, eliminated the persisters, while ADEP4 reduced their number. SAAP-148 further eradicated persisters within antibiotic-exposed, mature biofilms on the various surfaces. To conclude, antibiotic-exposed, mature MRSA biofilms on various surfaces have been developed as in vitro models for PJI. SAAP-148 is highly effective against persisters obtained from the biofilms as well as within these models. Antibiotic-exposed, mature biofilms on relevant surfaces can be instrumental in the search for novel treatment strategies to combat biofilm-associated infections.
Trump’s science-adviser pick hedges on climate change
Trump’s science-adviser pick hedges on climate change, Published online: 23 August 2018; doi:10.1038/d41586-018-06019-7
Meteorologist Kelvin Droegemeier offered few clues to his views on the topic when grilled at a US Senate hearing.
A plasmid on the road to becoming a virus
New fossils from Jebel Irhoud, Morocco and the pan-African origin of Homo sapiens
Nature 546, 7657 (2017). doi:10.1038/nature22336
Authors: Jean-Jacques Hublin, Abdelouahed Ben-Ncer, Shara E. Bailey, Sarah E. Freidline, Simon Neubauer, Matthew M. Skinner, Inga Bergmann, Adeline Le Cabec, Stefano Benazzi, Katerina Harvati & Philipp Gunz
Fossil evidence points to an African origin of Homo sapiens from a group called either H. heidelbergensis or H. rhodesiensis. However, the exact place and time of emergence of H. sapiens remain obscure because the fossil record is scarce and the chronological age of many key specimens remains uncertain. In particular, it is unclear whether the present day ‘modern’ morphology rapidly emerged approximately 200 thousand years ago (ka) among earlier representatives of H. sapiens or evolved gradually over the last 400 thousand years. Here we report newly discovered human fossils from Jebel Irhoud, Morocco, and interpret the affinities of the hominins from this site with other archaic and recent human groups. We identified a mosaic of features including facial, mandibular and dental morphology that aligns the Jebel Irhoud material with early or recent anatomically modern humans and more primitive neurocranial and endocranial morphology. In combination with an age of 315 ± 34 thousand years (as determined by thermoluminescence dating), this evidence makes Jebel Irhoud the oldest and richest African Middle Stone Age hominin site that documents early stages of the H. sapiens clade in which key features of modern morphology were established. Furthermore, it shows that the evolutionary processes behind the emergence of H. sapiens involved the whole African continent.
[Report] A four-legged snake from the Early Cretaceous of Gondwana
Optimal metabolic route search based on atom mappings
Motivation: A key computational problem in metabolic engineering is finding efficient metabolic routes from a source to a target compound in genome-scale reaction networks, potentially considering the addition of new reactions. Efficiency can be based on many factors, such as route length, the number of atoms conserved, and the number of new reactions (and the new enzymes to catalyze them) added to the route. Fast algorithms are needed to systematically search these large genome-scale reaction networks.
Results: We present the algorithm used in the new RouteSearch tool within the Pathway Tools software. This algorithm is based on a general Branch-and-Bound search and involves constructing a network of atom mappings to facilitate efficient searching. As far as we know, it is the first published algorithm that finds guaranteed optimal routes where atom conservation is part of the optimality criteria. RouteSearch includes a graphical user interface that speeds user understanding of its search results. We evaluated the algorithm on five example metabolic-engineering problems from the literature; for one problem the published solution was equivalent to the optimal route found by RouteSearch; for the remaining four problems, RouteSearch found the published solution as one of its best-scored solutions. These problems were each solved in less than 5 s of computational time.
Availability and implementation: RouteSearch is accessible at BioCyc.org by using the menu command RouteSearch and by downloading Pathway Tools. Pathway Tools software is freely available to academic users, and for a fee to commercial users. Download from: http://biocyc.org/download.shtml.
Contact: mario.latendresse@sri.com
Supplementary information: Supplementary data are available at Bioinformatics online.
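The abstract describes a branch-and-bound search over a network of atom mappings, but not the data structures involved, so what follows is only a minimal sketch under strong simplifying assumptions: each reaction edge records how many traced source atoms it fails to conserve and whether the reaction would have to be newly added, and all names (Reaction, find_route, the weights) are illustrative rather than part of the RouteSearch or Pathway Tools API.

```python
# Minimal branch-and-bound sketch of optimal-route search over an
# atom-mapping network. This is NOT the RouteSearch implementation;
# the graph encoding, the scoring weights, and all names are illustrative.
from dataclasses import dataclass

@dataclass(frozen=True)
class Reaction:
    substrate: str        # compound consumed by this step
    product: str          # compound produced by this step
    atoms_lost: int       # traced source atoms NOT carried into the product
    is_new: bool          # True if the reaction/enzyme must be added to the host

def find_route(reactions, source, target, w_len=1.0, w_atoms=2.0, w_new=5.0):
    """Branch-and-bound search for the cheapest route from source to target.
    Step cost = w_len + w_new*(new reaction) + w_atoms*(atoms lost), so the
    best route is short, conserves atoms, and avoids added reactions."""
    by_substrate = {}
    for r in reactions:
        by_substrate.setdefault(r.substrate, []).append(r)

    best = [float("inf"), None]            # incumbent: [cost, route]

    def expand(compound, cost, route, visited):
        if compound == target:
            if cost < best[0]:
                best[0], best[1] = cost, list(route)
            return
        # Bound: all step costs are non-negative and at least w_len more is
        # needed to reach the target, so prune any branch that can no longer
        # beat the incumbent solution.
        if cost + w_len >= best[0]:
            return
        for r in by_substrate.get(compound, []):
            if r.product in visited:       # avoid cycles
                continue
            step = w_len + (w_new if r.is_new else 0.0) + w_atoms * r.atoms_lost
            expand(r.product, cost + step, route + [r], visited | {r.product})

    expand(source, 0.0, [], {source})
    return best[0], best[1]

# Toy usage: two candidate routes from "glc" to "tgt"; the two-step route
# wins because it conserves more atoms and needs no added reaction.
toy = [
    Reaction("glc", "x", atoms_lost=0, is_new=False),
    Reaction("x", "tgt", atoms_lost=1, is_new=False),
    Reaction("glc", "tgt", atoms_lost=4, is_new=True),
]
print(find_route(toy, "glc", "tgt"))
```

The score here simply trades off route length, atoms lost, and added reactions with fixed weights on a toy reaction list; the real tool works on genome-scale networks with per-reaction atom mappings.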
Expanding the Marine Virosphere Using Metagenomics
by Carolina Megumi Mizuno, Francisco Rodriguez-Valera, Nikole E. Kimes, Rohit Ghai
Viruses infecting prokaryotic cells (phages) are the most abundant entities of the biosphere and contain a largely uncharted wealth of genomic diversity. They play a critical role in the biology of their hosts and in ecosystem functioning at large. The classical approaches studying phages require isolation from a pure culture of the host. Direct sequencing approaches have been hampered by the small amounts of phage DNA present in most natural habitats and the difficulty in applying meta-omic approaches, such as annotation of small reads and assembly. Serendipitously, it has been discovered that cellular metagenomes of highly productive ocean waters (the deep chlorophyll maximum) contain significant amounts of viral DNA derived from cells undergoing the lytic cycle. We have taken advantage of this phenomenon to retrieve metagenomic fosmids containing viral DNA from a Mediterranean deep chlorophyll maximum sample. This method allowed description of complete genomes of 208 new marine phages. The diversity of these genomes was remarkable, contributing 21 genomic groups of tailed bacteriophages of which 10 are completely new. Sequence based methods have allowed host assignment to many of them. These predicted hosts represent a wide variety of important marine prokaryotic microbes like members of SAR11 and SAR116 clades, Cyanobacteria and also the newly described low GC Actinobacteria. A metavirome constructed from the same habitat showed that many of the new phage genomes were abundantly represented. Furthermore, other available metaviromes also indicated that some of the new phages are globally distributed in low to medium latitude ocean waters. The availability of many genomes from the same sample allows a direct approach to viral population genomics confirming the remarkable mosaicism of phage genomes.
[Perspective] An Enzymatic Route to H2 Storage
Should phylogenetic modelling proceed from simple to complex or vice versa?
In statistical model testing, models can be tested by starting with the simplest model and progressively adding model complexity until the desired level of model fit is achieved. Alternatively, one can start with the most complex model and progressively delete unnecessary components while maintaining the desired level of model fit. The first approach is constructive, in the sense that the model is constructed piece by piece (stepwise addition), while the second approach is reductive, in the sense that the full model is pared down to its simplest form (stepwise deletion).
This distinction in approaches to modelling is relevant to the difference between using trees and networks as phylogenetic models.
At the moment, the most common approach to phylogenetic analysis is the constructive one. One starts with the simplest model, a bifurcating tree, and assesses the degree to which it fits the data. If the fit is poor, as it often is with multi-gene data, especially if the gene data are concatenated, then complexity is added. For example, one might include incomplete lineage sorting (ILS) in the model, which allows the different genes to fit different trees while still maintaining the need for a single dichotomous species tree. Alternatively, one might consider gene duplication-loss as a possible addition to the model, which is another major source of incompatibility between multi-gene data and a single species tree. Only if these additional complexities also fail to attain the desired degree of fit does one consider adding components of reticulate evolution to the model, such as hybridization or horizontal gene transfer (HGT).
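To make this constructive procedure concrete, here is a minimal sketch. It assumes nothing more than a hypothetical fit_model(components) function that fits a phylogenetic model containing the named extra components (on top of a bifurcating tree) and returns an AIC-like score, lower being better; the fixed addition order, the component labels and the stopping rule are all illustrative, not taken from any existing package.

```python
# Constructive (stepwise-addition) model building: start from a plain
# bifurcating tree and add complexity, in a fixed order, only while the
# current model does not yet fit adequately. fit_model() is a hypothetical
# stand-in that fits the model and returns an AIC-like score (lower = better).

ADDITION_ORDER = ["ILS", "duplication_loss", "hybridization", "HGT"]

def constructive_fit(fit_model, adequate):
    """fit_model(components) -> score; adequate(score) -> True once the
    desired level of model fit has been reached."""
    components = []                  # [] = plain bifurcating species tree
    score = fit_model(components)
    for extra in ADDITION_ORDER:
        if adequate(score):          # good enough already: stop here, so any
            break                    # component later in the order is never assessed
        components.append(extra)
        score = fit_model(components)
    return components, score
```

Note that hybridization and HGT can enter the model only if the components earlier in the order have already failed to produce an adequate fit; otherwise they are never evaluated at all, which is exactly the behaviour discussed in the following paragraphs.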
The reductive (or simplification) approach, however, proceeds the other way. A general network model is used as the starting point. The various components of this model would include a dichotomous tree as a special case, along with ILS, duplication-loss, hybridization, and HGT as individual components. These special cases are evaluated simultaneously, and each one is dropped if it is contributing nothing worthwhile to the model fit. The final model consists of the simplest combination of components that still maintains the specified fit of data and model; this may indeed be a simple tree.
The main advantage of the latter approach is that all of the components of the model are evaluated simultaneously, so that their potential interactions can be quantitatively assessed. Components are dropped from the model only if they contribute nothing to the model, either independently or in synergy with the other components. That is, they are dropped only if they can be shown to be redundant.
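A matching sketch of the reductive procedure, under the same assumption of a hypothetical AIC-like fit_model(), starts from the full network model and scores every leave-one-out model while all of the other components remain in place, dropping a component only when its removal is shown to be nearly free:

```python
# Reductive (stepwise-deletion) model fitting: start from the full network
# model and repeatedly drop whichever component is most clearly redundant,
# i.e. whose removal costs least while every other component stays in the
# model. fit_model() is the same hypothetical scoring stand-in as above.

FULL_MODEL = ["ILS", "duplication_loss", "hybridization", "HGT"]

def reductive_fit(fit_model, redundancy_margin=2.0):
    components = list(FULL_MODEL)        # tree plus all extra components
    score = fit_model(components)
    while components:
        # Score every leave-one-out model with the other components retained,
        # so each component is judged in the presence of all the others.
        trials = {c: fit_model([x for x in components if x != c])
                  for c in components}
        weakest, trial_score = min(trials.items(), key=lambda kv: kv[1])
        if trial_score - score > redundancy_margin:
            break                        # every remaining component earns its keep
        components.remove(weakest)       # redundant: drop it and re-evaluate
        score = trial_score
    return components, score             # may end up as the plain tree ([])
```

Here redundancy_margin stands in for the "desired level of model fit"; in a real analysis it might be an information-criterion difference or a likelihood-ratio test rather than a fixed number.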
This does not happen with the constructive approach to modelling. Here, the components are evaluated in some specified order, and components that are later in the order will not be evaluated unless the earlier components prove to be inadequate. These later components are thus potentially excluded from statistical consideration. This means that their possible contribution to biological explanation may never be quantitatively assessed.
So, in practice, evolutionary reticulation is considered to be a "last resort" in current phylogenetic analyses. It is considered as a possible biological explanation only if all else has already failed.
This philosophy seems to be as much a historical artifact as anything else. The first phylogenetic diagrams (by Buffon and Duchesne) were networks, not trees, but they were replaced a century later by the tree model suggested by Darwin, and the tree has retained its primacy since that time. This leads naturally to the constructive approach to modelling, which is so prevalent in the current literature.
However, there is no necessary statistical superiority of the constructive approach to modelling. Indeed, statisticians seem to consider forward and backward selection of model components to be essentially equivalent, although they may lead to different models for any given dataset. The most commonly cited advantage of the constructive approach is that it is likely to avoid the problems that arise from having too many components in the model (over-fitting).
Nevertheless, the reductive approach has the distinct advantage of simultaneously evaluating all possible special cases of a network, and thus does not exclude any possible biological explanation that might apply to the observed data. This may provide more biological insight than does the constructive approach to phylogenetic modelling.