
PROCESS AND INFORMATION ENTROPIES FOR COMPLEX SYSTEMS
Rudolf A. Hanel (Personal webpage)
Section for Science of Complex Systems, CeMSIIS, Medical University of Vienna
What do entropy, information, complexity, story, and the absence of free lunches have to do with each other? Here we try to sketch a comprehensive picture of what complexity science is about. We discuss a fundamental source of uncertainty that is frequently ignored: the impossibility of inferring a process purely from statistical information. Statistical means are fine for estimating system parameters once the class of processes is already known, but they become useless and misleading when nothing is known about the process we sample. The work we need to invest into knowing the (defining) rules governing a process class of interest appears to be independent of the work we have to invest into knowing the statistical properties of that class. This re-expresses a fundamental observation (Wolpert & Macready 1995): there is no free lunch to be had. In order to know what works, we have to know the context in which something works.

What does this tell us about our concepts of information and entropy? In the context of equilibrium systems, entropy concepts all ultimately take the form of Shannon entropy, but they describe distinct properties in the context of complex (e.g. driven, dissipative, path-dependent, or non-ergodic) processes: Boltzmann entropy characterizes the maximum configuration of a process, while extensive entropies describe its phase-space growth.
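The claim that statistics alone cannot identify a process can be made concrete with a minimal sketch (my own illustrative example, not from the text above): two processes with very different generating rules, an i.i.d. fair coin and a deterministic alternating sequence, share the same single-symbol statistics, so the marginal Shannon entropy cannot distinguish them.

```python
from collections import Counter
import math
import random

def shannon_entropy(seq):
    """Shannon entropy (in bits) of the empirical symbol distribution."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(0)
n = 10000

# Process A: i.i.d. fair coin flips -- maximally random, no memory.
iid = [random.randint(0, 1) for _ in range(n)]

# Process B: strictly alternating 0,1,0,1,... -- fully deterministic.
alternating = [i % 2 for i in range(n)]

# Both sequences contain 0s and 1s in (essentially) equal proportion,
# so their marginal Shannon entropies are both close to 1 bit, even though
# one process is unpredictable and the other perfectly predictable.
print(shannon_entropy(iid))
print(shannon_entropy(alternating))
```

Only knowledge of the rule class (here: "memoryless coin" vs. "deterministic alternation") resolves the ambiguity; the marginal statistics are identical by construction.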