5th International Electronic Conference on Entropy and Its Applications

Conference Information
Name: 5th International Electronic Conference on Entropy and Its Applications
Date: 2019-11-18 - 2019-11-30

Latest articles from this conference

Published: 14 March 2020
by MDPI
Proceedings, Volume 46; doi:10.3390/ecea-5-06675

Abstract:
A graph model is presented to describe multilevel atomic structure. As an example, we take the double Λ configuration in alkali-metal atoms with hyperfine structure and nuclear spin I = 3/2, represented as a graph with four vertices. Links are treated as coherences. We introduce the transition matrix, which reduces to the connectivity matrix in the static graph model. In general, the transition matrix describes the spatiotemporal behavior of the dynamic graph model; it also accounts for multiple connections and self-looping of vertices. The atomic excitation is made by short pulses so that the hyperfine structure is well resolved. The entropy associated with the proposed dynamic graph model is used to identify transitions, as well as local stabilization in the system, without invoking the energy concept of the propagated pulses.
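As a rough illustration of the idea (this sketch is mine, not the authors' construction), a four-vertex graph with links as coherences can be encoded as a binary connectivity matrix; normalizing its rows gives a transition matrix whose stationary edge-use distribution admits a Shannon entropy. The matrix `A` below is one hypothetical wiring of the four hyperfine levels, not the paper's actual model.

```python
import numpy as np

# Hypothetical static connectivity of a 4-vertex double-Lambda graph:
# vertex i is linked to vertex j when A[i, j] = 1 (links = coherences).
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]])

def row_stochastic(M):
    """Normalize rows so each vertex's outgoing weights form a distribution."""
    M = np.asarray(M, dtype=float)
    return M / M.sum(axis=1, keepdims=True)

def graph_entropy(T):
    """Shannon entropy (bits) of the stationary edge-use distribution of T."""
    T = row_stochastic(T)
    # Stationary distribution: left eigenvector of T for its largest eigenvalue.
    w, v = np.linalg.eig(T.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = np.abs(pi) / np.abs(pi).sum()
    p = (pi[:, None] * T).ravel()   # probability of traversing each edge
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(graph_entropy(A))  # entropy of the static 4-vertex graph
```

A dynamic graph model would replace the fixed `A` by a time-dependent weighted matrix (allowing self-loops on the diagonal) and track how this entropy changes as the pulses propagate.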
Published: 13 March 2020
by MDPI
Proceedings, Volume 46; doi:10.3390/ecea-5-06682

Abstract:
There are various approaches to the problem of how one is supposed to conduct a statistical analysis. Different analyses can lead to contradictory conclusions in some problems, so this is not a satisfactory state of affairs. All approaches seem to appeal to the evidence in the data concerning questions of interest as a justification for the methodology employed. It is fair to say, however, that none of the most commonly used methodologies is fully explicit about how statistical evidence is to be characterized and measured. We will discuss the general problem of statistical reasoning and the development of a theory for it that is based on being precise about statistical evidence. This will be shown to lead to the resolution of a number of problems.
Proceedings of 5th International Electronic Conference on Entropy and Its Applications; doi:10.3390/ecea-5-06709

Abstract:
A key element of thermodynamics is small entropy fluctuations away from local maxima at equilibrium. These fluctuating entropy decreases become proportionally larger as the volume of the thermodynamic system decreases. Lessened entropy indicates the formation of organized, fluctuating mesoscopic structures, which depend upon the microscopic interactions among the atomic constituents of the system. Entropy fluctuations may be represented by a thermodynamic information metric that directly yields a thermodynamic Ricci curvature scalar R. R is a thermodynamic invariant that measures mesoscopic structure formation within the system. In my talk, I discuss the calculation and physical interpretation of R in several scenarios: fluid systems, including supercooled liquid water; simple solids; spin systems; quantum fluid models; the quark-meson plasma; and black hole thermodynamics. This range of applications offers a strong argument for the effectiveness of R within thermodynamics.
Proceedings of 5th International Electronic Conference on Entropy and Its Applications; doi:10.3390/ecea-5-06710

Abstract:
The development of systematic coarse-grained mesoscopic models for complex molecular systems is an intense research area. Here we first give an overview of different methods for obtaining optimally parametrized coarse-grained models, starting from a detailed atomistic representation of high-dimensional molecular systems. We focus on methods based on information theory, such as relative entropy, showing that they provide parameterizations of coarse-grained models at equilibrium by minimizing a fitting functional over a parameter space. We also connect them with structure-based (inverse Boltzmann) and force-matching methods. All of the methods mentioned are, in principle, employed to approximate a many-body potential, the (n-body) potential of mean force, describing the equilibrium distribution of coarse-grained sites observed in simulations of atomistically detailed models. We also present, in a mathematically consistent way, the relative entropy and force-matching methods and their equivalence, which we derive for general nonlinear coarse-graining maps. We apply and compare the above-described methodologies to several molecular systems: a simple fluid (methane), water, and a bulk polymer (polyethylene) system. Finally, for the latter we also provide reliable confidence intervals using a statistical resampling technique, the bootstrap method.
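The relative entropy idea can be sketched on a toy problem (my illustration, not the authors' implementation): the "atomistic" ensemble is sampled from a known Gaussian, the coarse-grained model is a harmonic potential U(x; k) = k x²/2, and the stiffness k is fitted by gradient descent on the relative entropy between the two equilibrium distributions.

```python
import numpy as np

# Toy relative-entropy coarse-graining.  For an exponential-family CG model,
# dS_rel/dk = beta * (<dU/dk>_AA - <dU/dk>_CG); for U = k x^2 / 2 the CG
# average is <x^2>_CG = 1/(beta k), so the minimum matches second moments.
rng = np.random.default_rng(0)
beta = 1.0
x_aa = rng.normal(0.0, 2.0, size=200_000)   # "atomistic" samples, variance 4

k = 1.0                                      # initial CG stiffness
for _ in range(500):
    grad = 0.5 * beta * (np.mean(x_aa**2) - 1.0 / (beta * k))
    k -= 0.05 * grad                         # descend the fitting functional

# At the minimum, k ~ 1 / (beta * var) = 0.25 for these samples.
print(k)
```

The same minimization structure carries over to realistic CG potentials, where the averages over the coarse-grained ensemble must themselves be estimated by simulation rather than in closed form.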
Alejandro Chinea Manrique De Lara
Proceedings of 5th International Electronic Conference on Entropy and Its Applications; doi:10.3390/ecea-5-06694

Abstract:
The notion that the brain has a resting-state mode of functioning has received increasing attention in recent years. The idea derives from experimental observations of a relatively spatially and temporally uniform high level of neuronal activity when no explicit task is being performed. Surprisingly, the total energy consumption supporting neuronal firing in this conscious awake baseline state is orders of magnitude larger than the energy changes during stimulation. This paper presents a novel and counter-intuitive explanation of the high energy consumption of the brain at rest, obtained using the recently developed intelligence and embodiment hypothesis. This hypothesis is based on evolutionary neuroscience and postulates the existence of a common information-processing principle, associated with naturally evolved nervous systems, that serves as the foundation from which intelligence can emerge and underlies the efficiency of the brain's computations. The high energy consumption of the brain at rest is shown to be related to the most probable state of an equilibrium statistical mechanics model aimed at capturing the behavior of a system constrained by power consumption and evolutionarily designed to minimize metabolic demands.
Published: 18 November 2019
by MDPI
Proceedings, Volume 46; doi:10.3390/ecea-5-06693

Abstract:
The aim of this work was to analyze, in the entropy–complexity plane (HxC), time series coming from ECG recordings, with the objective of discriminating between two groups of patients: normal sinus rhythm and cardiac arrhythmias. The HxC plane used in this study had normalized Shannon entropy as one axis and statistical complexity as the other. To compute the entropy, the probability distribution function (PDF) of the observed data was obtained using the methodology proposed by Bandt and Pompe (2002). The database used in the present study was ECG recordings obtained from PhysioNet: 47 long-term signals of patients with diagnosed cardiac arrhythmias and 18 long-term signals from normal sinus rhythm patients were processed. Average values of statistical complexity and normalized Shannon entropy were calculated and analyzed in the HxC plane for each time series. The average complexity values of the ECGs of patients with diagnosed arrhythmias were larger than those of the normal sinus rhythm group, while the average Shannon entropy values for arrhythmia patients were lower. This characteristic made it possible to discriminate the positions of the two signal groups in the HxC plane. The results were analyzed through a multivariate statistical hypothesis test. The proposed methodology has remarkable conceptual simplicity and shows promising efficiency in the detection of cardiovascular pathologies.
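The Bandt–Pompe (2002) construction of the PDF is standard and can be sketched compactly (a minimal illustration; the parameter names and the embedding dimension d = 3 are my choices, not the paper's settings): slide a window of length d over the series, record the ordinal pattern (ranking) of each window, and use the pattern frequencies as probabilities.

```python
import numpy as np
from itertools import permutations

def bandt_pompe_pdf(x, d=3):
    """PDF over the d! ordinal patterns of embedding dimension d."""
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1   # ranking of the window
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def normalized_shannon_entropy(p):
    """Shannon entropy of p divided by its maximum, log(d!)."""
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz)) / np.log(len(p))

# A monotone series uses a single ordinal pattern -> minimal entropy;
# white noise spreads over all d! patterns -> near-maximal entropy.
print(normalized_shannon_entropy(bandt_pompe_pdf(np.arange(100))))
rng = np.random.default_rng(1)
print(normalized_shannon_entropy(bandt_pompe_pdf(rng.normal(size=20_000))))
```

The complexity axis of the HxC plane then pairs this normalized entropy with a disequilibrium term (typically a Jensen–Shannon distance to the uniform pattern distribution).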
Proceedings of 5th International Electronic Conference on Entropy and Its Applications; doi:10.3390/ecea-5-06697

Abstract:
In this talk, we propose an information-theoretic approach to designing functional representations that extract the hidden common structure shared by a set of random variables. The main idea is to measure the common information between the random variables by Watanabe's total correlation, and then to find the hidden attributes of these random variables such that the common information is reduced the most given these attributes. We show that these hidden attributes can be characterized by an exponential family specified by the eigen-decomposition of a pairwise joint distribution matrix. We then adopt the log-likelihood functions for estimating these hidden attributes as the desired functional representations of the random variables, and show that these representations are informative for describing the common structure. Moreover, we design both a multivariate alternating conditional expectation (MACE) algorithm to compute the proposed functional representations for discrete data, and a novel neural network training scheme for continuous or high-dimensional data. Finally, the performance of our algorithms is validated by numerical simulations on the MNIST digit recognition task.
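Watanabe's total correlation itself is a standard quantity and easy to compute for a discrete joint distribution: C(X₁, …, Xₙ) = Σᵢ H(Xᵢ) − H(X₁, …, Xₙ), which is zero iff the variables are independent. A minimal sketch (illustrative code, not the authors' algorithm):

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability array."""
    p = np.asarray(p, dtype=float).ravel()
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz))

def total_correlation(joint):
    """Watanabe total correlation of an n-dimensional joint pmf array."""
    joint = np.asarray(joint, dtype=float)
    marginal_sum = sum(
        entropy(joint.sum(axis=tuple(j for j in range(joint.ndim) if j != i)))
        for i in range(joint.ndim)
    )
    return marginal_sum - entropy(joint)

# Two perfectly correlated fair bits share one full bit of common information:
xx = np.array([[0.5, 0.0], [0.0, 0.5]])
print(total_correlation(xx))  # 1.0
# Independent fair bits share none:
indep = np.outer([0.5, 0.5], [0.5, 0.5])
print(total_correlation(indep))  # 0.0
```

The talk's proposal then seeks hidden attributes whose conditioning drives this quantity down as far as possible.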
Published: 18 November 2019
by MDPI
Proceedings, Volume 46; doi:10.3390/ecea-5-06692

Abstract:
It has been conjectured that the origin of the fundamental molecules of life, their proliferation over the surface of Earth, and their complexation through time are examples of photochemical dissipative structuring, dissipative proliferation, and dissipative selection, respectively, arising out of the nonequilibrium conditions created on Earth’s surface by the solar photon spectrum. Here I describe the nonequilibrium thermodynamics and the photochemical mechanisms involved in the synthesis and evolution of the fundamental molecules of life from simpler, more common precursor molecules under the long-wavelength UVC and UVB solar photons prevailing at Earth’s surface during the Archean. Dissipative structuring through photochemical mechanisms leads to carbon-based UVC pigments with peaked conical intersections, which endow them with a large photon dissipative capacity (broad-wavelength absorption and rapid radiationless de-excitation). Dissipative proliferation occurs when the photochemical dissipative structuring becomes autocatalytic. Dissipative selection arises when fluctuations lead the system to new stationary states (corresponding to different molecular concentration profiles) of greater dissipative capacity, as predicted by the universal evolution criterion of classical irreversible thermodynamic theory established by Onsager, Glansdorff, and Prigogine. An example is given of the UV photochemical dissipative structuring, proliferation, and selection of the nucleobase adenine from an aqueous solution of HCN under UVC light.
Proceedings of 5th International Electronic Conference on Entropy and Its Applications; doi:10.3390/ecea-5-06671

Abstract:
The organization of ecosystems’ microbiomes is an epitomic feature of ecosystem function and an incredibly fascinating system given its complexity, its ecology and evolution, and its practical applications for individual and population health. Because of its “unknowns,” the microbiome also provides the opportunity to test and develop information-theoretic models that mimic and predict its dynamics. A novel information- and network-theoretic model that predicts microbiome network organization, diversity, dynamics, and stability for the human gut microbiome is presented. The model is able to classify health states based on microbiome entropic patterns that, under optimal biological function, are related to neutral scale-free information organization of species interactions. The healthy state is characterized by an optimal metabolic function that is predicted by quintessential macroecological indicators whose variability is indicative of state transitions. Information propagation analyses detect total species importance, proportional to outgoing information flow, which can be used for microbial engineering or for disease diagnosis and etiognosis. Finally, a link with ocean microbial ecosystems is highlighted, as well as the collectivity–diversity–dynamics triality.
Proceedings of 5th International Electronic Conference on Entropy and Its Applications; doi:10.3390/ecea-5-06684

Abstract:
Recent developments in artificial intelligence have significantly improved the quality and efficiency of generating fake face images; for example, face manipulations by DeepFake are so realistic that their authenticity is difficult to establish, either automatically or by humans. To improve the efficiency of distinguishing AI-generated facial images from real ones, a novel model has been developed based on deep learning and error level analysis (ELA) detection, which is related to entropy and information theory through, for example, the cross-entropy loss function in the final Softmax layer, normalized mutual information in image preprocessing, and information-theoretic encoders. Due to limitations of computing resources and production time, the DeepFake algorithm can only generate limited resolutions, resulting in two different image compression ratios between the fake face area (the foreground) and the original area (the background), which leaves distinctive artifacts. Using the error level analysis detection method, we can detect the presence or absence of different image compression ratios, and then use a CNN to decide whether the image is fake. Experiments show that the training efficiency of the CNN model can be significantly improved by the ELA method, and the detection accuracy can reach more than 97% with this method’s CNN architecture. Compared to state-of-the-art models, the proposed model has advantages such as fewer layers, shorter training time, and higher efficiency.
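The ELA intuition behind the paper can be sketched with a toy model (my simplification, not the paper's pipeline): real ELA recompresses a JPEG and inspects the residual image; here JPEG compression is replaced by simple uniform quantization so the example stays self-contained. A region that was previously compressed at a different ratio than the background leaves a different residual magnitude when the whole image is recompressed once more.

```python
import numpy as np

def quantize(img, step):
    """Stand-in for lossy compression: snap pixel values to multiples of `step`."""
    return np.round(img / step) * step

rng = np.random.default_rng(2)
img = quantize(rng.uniform(0, 255, size=(64, 64)), 8)   # background: one ratio
img[24:40, 24:40] = quantize(img[24:40, 24:40], 12)     # "pasted" region: another ratio

# ELA: recompress the whole image once more and look at the residual map.
level = np.abs(img - quantize(img, 8))

mask = np.zeros(img.shape, dtype=bool)
mask[24:40, 24:40] = True
print(level[mask].mean(), level[~mask].mean())  # the pasted region stands out
```

In the paper's setting, this residual map (computed with actual JPEG recompression) is what the CNN consumes, which is why a shallow architecture already suffices.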