
In this article, we show the mystical relationship between science and information. Since every substance carries a price tag of information, including all the building blocks of our universe, one cannot simply ignore information when dealing with science. We show that there is a profound connection between information and entropy, a quantity that is well accepted in science; without this connection, information would be far more difficult to apply to science. Two of the most important pillars of modern physics are Einstein's theory of relativity and Schrödinger's quantum mechanics, and we show that a profound relationship exists between them by means of the uncertainty principle. From the uncertainty relation, we show that every bit of information takes time and energy to transfer, to create, and to observe. Since one cannot create something from nothing, anything to be created demands a huge amount of energy and a great deal of entropy to make it happen. Our question is: can we afford it?
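The claim that every bit of information costs time and energy can be made quantitative. A minimal sketch, not taken from this article: it combines Landauer's limit (the minimum energy k_B T ln 2 to erase one bit at temperature T) with the time-energy uncertainty relation ΔE·Δt ≥ ħ/2 to bound how quickly an energy transaction of that size can occur. The room-temperature value of 300 K is an illustrative assumption.

```python
import math

# Physical constants (CODATA recommended values)
K_B = 1.380649e-23      # Boltzmann constant, J/K
HBAR = 1.054571817e-34  # reduced Planck constant, J*s

def landauer_energy_per_bit(temperature_k: float) -> float:
    """Minimum energy (J) dissipated to erase one bit at temperature T
    (Landauer's limit: k_B * T * ln 2)."""
    return K_B * temperature_k * math.log(2)

def min_time_for_energy(delta_e: float) -> float:
    """Lower bound on the time scale (s) associated with an energy scale
    delta_e, from the time-energy uncertainty relation dE * dt >= hbar/2."""
    return HBAR / (2.0 * delta_e)

if __name__ == "__main__":
    T = 300.0  # room temperature in kelvin (illustrative assumption)
    e_bit = landauer_energy_per_bit(T)   # ~2.87e-21 J per bit
    t_min = min_time_for_energy(e_bit)
    print(f"Landauer limit at {T} K: {e_bit:.3e} J per bit")
    print(f"Uncertainty-bounded time at that energy scale: {t_min:.3e} s")
```

At 300 K this gives roughly 3 × 10⁻²¹ J per bit, a tiny number for one bit but one that accumulates rapidly when "anything to be created" encodes an astronomical number of bits.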
