Abstract
In the early to mid-20th century, reductionism as a concept in biology was challenged by key thinkers, including Ludwig von Bertalanffy. He proposed that living organisms are specific examples of complex systems and, as such, should display characteristics including hierarchical organisation and emergent behaviour. Yet the true study of complete biological systems (for example, metabolism) did not become possible until technological advances some 60 years later. Technology now exists that permits the measurement of complete levels of the biological hierarchy, for example the genome and transcriptome. The complexity and scale of these data require computational models for their interpretation. The combination of these elements (systems thinking, high-dimensional data and computation) defines systems biology, typically accompanied by some notion of iterative model refinement. Only sequencing-based technologies, however, offer full coverage. Other 'omics' platforms trade coverage for sensitivity, although the densely connected nature of biological networks suggests that full coverage may not be necessary. Systems biology models are often characterised as either 'bottom-up' (mechanistic) or 'top-down' (statistical). This distinction can mislead, as all models rely on data and all are, to some degree, 'middle-out'. Systems biology has matured as a discipline, and its methods are commonplace in many laboratories. However, many challenges remain, especially those related to large-scale data integration.
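The bottom-up versus top-down contrast drawn above can be made concrete with a small sketch. The example below is purely illustrative and assumes a hypothetical two-species pathway with invented rate constants and simulated measurements; it is not a model from the article. The mechanistic half integrates assumed kinetics, while the statistical half estimates an association directly from noisy data without assuming any mechanism.

# Illustrative contrast between 'bottom-up' (mechanistic) and 'top-down'
# (statistical) modelling. Species names, rate constants and "measurements"
# are invented for this sketch; nothing here comes from the article itself.

import numpy as np
from scipy.integrate import solve_ivp

# --- Bottom-up: an ODE model built from assumed kinetics ---
def toy_pathway(t, y, k_syn, k_conv, k_deg):
    """Hypothetical pathway: S is synthesised at rate k_syn, converted to P
    at rate k_conv*S, and P is degraded at rate k_deg*P."""
    s, p = y
    return [k_syn - k_conv * s, k_conv * s - k_deg * p]

sol = solve_ivp(toy_pathway, (0.0, 50.0), y0=[1.0, 0.0],
                args=(0.5, 0.2, 0.1), dense_output=True)
t = np.linspace(0.0, 50.0, 200)
s_traj, p_traj = sol.sol(t)

# --- Top-down: infer an association directly from 'omics-like measurements ---
rng = np.random.default_rng(0)
noisy_s = s_traj + rng.normal(scale=0.1, size=t.size)   # simulated noisy data
noisy_p = p_traj + rng.normal(scale=0.1, size=t.size)
r = np.corrcoef(noisy_s, noisy_p)[0, 1]                  # no mechanism assumed

print(f"bottom-up steady state: S = {0.5/0.2:.2f}, P = {0.5/0.1:.2f}")
print(f"top-down correlation between measured S and P: r = {r:.2f}")

In practice the two views inform each other: the statistical association constrains which mechanistic structures are plausible, and the mechanistic model suggests which variables are worth measuring, which is one reading of the 'middle-out' point in the abstract.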