"Structure has always been the next frontier. Computational biologists use the term 'twilight zone' for the region of protein sequence space where molecular structures are still only vaguely known and susceptible to further refinement. A realm of sequences where, as in Plato’s Cave, appearances are deceiving and inference by homology must necessarily be ruled out...."
"In some labs, biology and computer science are converging. On the one hand, computer scientists are working towards creating computer chips inspired by the circuitry of ..."
A nice News and Views piece. Thanks, Christina Agapakis. Very interesting that you mentioned systems theory. In my opinion, this provides the new paradigm in science: we have moved from a reductionist, hypothesis-driven science to a data-driven one. Two computing developments are responsible for the paradigm change. The first is the enormous storage capacity in the cloud. We now talk about magnitudes on the petabyte scale: one petabyte equals one quadrillion bytes, or 1000 terabytes, i.e. 10^15 bytes. Google processes about 24 petabytes per day. The second is that a huge number of computers are now connected via the Internet, giving us access to this huge amount of data from many locations. In addition, we have Twitter, Facebook and many other social networks, which give a context for this enormous amount of data.
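The data scales mentioned above can be checked with a quick calculation (decimal SI units; the 24 PB/day figure is the one cited in the comment):

```python
# Decimal (SI) storage units used in the comment above.
TERABYTE = 10**12          # 1 TB = 10^12 bytes
PETABYTE = 10**15          # 1 PB = 10^15 bytes = one quadrillion bytes

# One petabyte is indeed 1000 terabytes.
assert PETABYTE == 1000 * TERABYTE

# Google's reported daily processing volume, as cited above.
daily_bytes = 24 * PETABYTE
print(f"24 PB/day = {daily_bytes:.2e} bytes/day")
```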
These changes have resulted in huge quantities of data and complex systems: a data flood. This is a problem normal science cannot solve, because the structures of the scientific world were designed to fit a pre-computer age. The hypothesis-driven method can deal with simple correlations between A and B, but it fails when the problem becomes more complex, with many factors, e.g. A through I or even more. That is too complex for deducing an empirical consequence to test.
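The jump in complexity described above can be made concrete with a little combinatorics. With two factors, A and B, there is a single pairwise relation to test; with nine factors (the "A to I" example), the number of possible interactions explodes. A minimal sketch:

```python
from itertools import combinations
from math import comb

# Two factors, A and B: exactly one pairwise relation.
assert comb(2, 2) == 1

# Nine factors, A through I.
factors = "ABCDEFGHI"
pairwise = comb(len(factors), 2)                   # two-factor relations
all_subsets = 2**len(factors) - len(factors) - 1   # all multi-factor combinations (size >= 2)

# Cross-check the pairwise count by enumeration.
assert len(list(combinations(factors, 2))) == pairwise

print(pairwise, all_subsets)
```

Nine factors already give 36 pairwise relations and 502 multi-factor combinations, which is why testing one hypothesis at a time stops scaling.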
A novel theoretical basis for science has evolved, and groundbreaking theories have been published. As you mentioned: 1948 – Norbert Wiener: Cybernetics: Or Control and Communication in the Animal and the Machine. Also of interest: 1956 – William Ross Ashby: An Introduction to Cybernetics. 1968 – Ludwig von Bertalanffy: General System Theory: Foundations, Development, Applications. Notable further developments of systems theory are Heinz von Foerster’s second-order cybernetics, Ilya Prigogine’s work on self-organization and his linking of systems theory with thermodynamics, and Mitchell Feigenbaum’s work on chaos theory. Systems theory is a foundation of software and thus influences the scientific method. Contemporary applications of systems theory are systems biology and synthetic biology.

This philosophical movement has two components: idealism and systems theory.

Idealism: Idealism is based on Plato’s theory of forms (ideas). In line with this theory, cybernetics assumes that the human nervous system computes reality: our brain calculates a model of an object. Human perception is the basis of the scientific method, and the key tools are mathematics and logic. The result of this calculation is a model, which is not identical to the object. Thus, it is impossible to achieve knowledge about the world as it exists independently of us.

Systems theory: In systems theory, contemporary science moves from reductionism to a more holistic position. A system is a set of interacting or interdependent components forming an integrated whole. A system's behavior involves input, processing and output of data, and can be self-organizing and self-regulating by feedback.

Ref http://bit.ly/kjOVs2 and http://bit.ly/lMbKc8
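The idea of a system that is self-regulating by feedback can be sketched in a few lines. This is a hypothetical toy model (a proportional negative-feedback loop pulling a state toward a set point), not any specific biological or cybernetic system; the function name and parameters are illustrative only:

```python
def regulate(state, setpoint, gain=0.5, steps=20):
    """Negative feedback: at each step the output is compared with the
    goal, and the system corrects a fraction of the remaining error."""
    history = [state]
    for _ in range(steps):
        error = setpoint - state   # feedback: compare output with the goal
        state += gain * error      # correction proportional to the error
        history.append(state)
    return history

trace = regulate(state=0.0, setpoint=10.0)
# The state converges toward the set point: self-regulation by feedback.
print(round(trace[0], 3), round(trace[-1], 3))
```

The same loop structure (measure, compare, correct) underlies Wiener's cybernetic view of control in "the animal and the machine."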
DNA2.0 announces the integration of the first standardized information exchange framework for synthetic biology into the company's breakthrough gene design and assembly application. Menlo Park, CA (PRWEB) December 14,...
(PhysOrg.com) -- When it comes to transporting a cell's valuable electrons, the metal-reducing microbe Shewanella oneidensis only trusts stable, mature proteins, according to scientists at Pacific Northwest National Laboratory.
"Cell factories are envisioned as the future workhorse manufacturers of pharmaceuticals, biofuels and biomaterials. Two main engines work in coordination inside a cell factory’s machinery: gene regulatory networks and biochemical pathways. Both networks are amenable to an engineering approach for their modeling, design and validation. Therefore, off-the-shelf implementations of these bioengineering appliances are becoming more widespread with the advent of rational synthetic biology. Take for instance a cell factory for the synthesis of natural products or secondary metabolites..."
I have referred to iPSCs in a previous post as 'franken-cells' and questioned whether they can offer important biological insights to the world. In this post, I'll eat my words and offer some examples where their artificiality is an asset.