# Publications by PierGianLuca Porta Mana

## Perfect detection of spikes in the linear sub-threshold dynamics of point neurons

Frontiers in Neuroinformatics **11** (2018)

© 2018 Krishnan, Porta Mana, Helias, Diesmann and Di Napoli. Spiking neuronal networks are usually simulated with one of three main schemes: the classical time-driven and event-driven schemes, and the more recent hybrid scheme. All three schemes evolve the state of a neuron through a series of checkpoints: equally spaced in the first scheme and determined neuron-wise by spike events in the latter two. The time-driven and the hybrid scheme determine whether the membrane potential of a neuron crosses a threshold at the end of the time interval between consecutive checkpoints. Threshold crossing can, however, occur within the interval even if this test is negative. Spikes can therefore be missed. The present work offers an alternative geometric point of view on neuronal dynamics, and derives, implements, and benchmarks a method for perfect retrospective spike detection. This method can be applied to neuron models with affine or linear subthreshold dynamics. The idea behind the method is to propagate the threshold with a time-inverted dynamics, testing whether the threshold crosses the neuron state to be evolved, rather than vice versa. Algebraically this translates into a set of inequalities necessary and sufficient for threshold crossing. This test is slower than the imperfect one, but can be optimized in several ways. Comparison confirms earlier results that the imperfect tests rarely miss spikes (fewer than a fraction 10⁻⁸ of spikes missed) in biologically relevant settings.
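A minimal sketch of why the endpoint test is imperfect, using a hypothetical two-variable linear neuron (membrane potential driven by an exponentially decaying synaptic current); all constants are illustrative and not taken from the paper, and the retrospective test is emulated here by dense sampling rather than the paper's exact inequalities:

```python
import numpy as np

tau_m, tau_s = 10.0, 2.0   # membrane and synaptic time constants (ms)
theta = 1.0                # spike threshold

def V_exact(t, V0, I0):
    """Closed-form subthreshold solution of
       dV/dt = -V/tau_m + I,  dI/dt = -I/tau_s."""
    return (V0 * np.exp(-t / tau_m)
            + I0 * (np.exp(-t / tau_m) - np.exp(-t / tau_s))
                 / (1.0 / tau_s - 1.0 / tau_m))

V0, I0, h = 0.8, 0.6, 10.0   # state at the last checkpoint; 10 ms step

# Imperfect test: check the membrane potential only at the end of the step.
endpoint_spike = V_exact(h, V0, I0) >= theta

# Retrospective check, emulated by dense sampling inside the interval
# (the paper instead derives exact necessary and sufficient inequalities).
ts = np.linspace(0.0, h, 1001)
dense_spike = bool(np.any(V_exact(ts, V0, I0) >= theta))

print(endpoint_spike, dense_spike)   # the endpoint test misses this spike
```

Here the membrane potential overshoots the threshold mid-interval and decays back below it before the checkpoint, which is exactly the failure mode the paper's method eliminates.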

## Bistability, non-ergodicity, and inhibition in pairwise maximum-entropy models

PLoS Computational Biology **13** (2017) e1005762

Pairwise maximum-entropy models have been used in neuroscience to predict the activity of neuronal populations, given only the time-averaged correlations of the neuron activities. This paper provides evidence that the pairwise model, applied to experimental recordings, would produce a bimodal distribution for the population-averaged activity, and for some population sizes the second mode would peak at high activities, which experimentally would correspond to 90% of the neuron population being active within time windows of a few milliseconds. Several problems are connected with this bimodality: 1. The presence of the high-activity mode is unrealistic in view of observed neuronal activity and on neurobiological grounds. 2. Boltzmann learning becomes non-ergodic, hence the pairwise maximum-entropy distribution cannot be found: in fact, Boltzmann learning would produce an incorrect distribution; similarly, common variants of mean-field approximations also produce an incorrect distribution. 3. The Glauber dynamics associated with the model is unrealistically bistable and cannot be used to generate realistic surrogate data. This bimodality problem is first demonstrated for an experimental dataset from 159 neurons in the motor cortex of a macaque monkey. Evidence is then provided that this problem affects typical neural recordings of population sizes of a couple of hundred neurons or more. The cause of the bimodality problem is identified as the inability of standard maximum-entropy distributions with a uniform reference measure to model neuronal inhibition. To eliminate this problem a modified maximum-entropy model is presented, which reflects a basic effect of inhibition in the form of a simple but non-uniform reference measure. This model does not lead to unrealistic bimodalities, can be found with Boltzmann learning, and has an associated Glauber dynamics which incorporates a minimal asymmetric inhibition.
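The bimodality can be sketched with a homogeneous pairwise model, where the distribution of the number K of active neurons reduces to P(K) ∝ C(N,K) exp(hK + (J/2)K(K−1)); the parameters below are invented to exhibit the effect and are not fitted to the paper's data:

```python
import numpy as np
from math import lgamma

# Homogeneous pairwise maximum-entropy model over N binary neurons,
# collapsed onto the population activity K (illustrative parameters).
N, h, J = 100, -3.0, 0.06

def log_weight(K):
    # log C(N,K) via lgamma, plus the pairwise energy terms
    log_binom = lgamma(N + 1) - lgamma(K + 1) - lgamma(N - K + 1)
    return log_binom + h * K + 0.5 * J * K * (K - 1)

logw = np.array([log_weight(K) for K in range(N + 1)])
P = np.exp(logw - logw.max())
P /= P.sum()

# Local maxima of the population-activity distribution:
modes = [K for K in range(1, N) if P[K] > P[K - 1] and P[K] > P[K + 1]]
print(modes)   # one low-activity mode and one unrealistically high-activity mode
```

With excitatory mean coupling (J > 0) the quadratic term eventually dominates the binomial entropy term, producing the second, high-activity mode that the paper argues is biologically unrealistic.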

## Toward a stochastic parameterization of ocean mesoscale eddies

Ocean Modelling **79** (2014) 1-20

## Notes on affine and convex spaces

(2011)

These notes are a short introduction to affine and convex spaces, written especially for physics students. They try to connect and summarize the different elementary presentations available in the mathematical literature. References are also provided, as well as an example showing the relevance and usefulness of affine spaces in classical physics.

## Conjectures and questions in convex geometry (of interest for quantum theory and other physical statistical theories)

(2011)

Some conjectures and open problems in convex geometry are presented, and their physical origin, meaning, and importance, for quantum theory and generic statistical theories, are briefly discussed.

## On two recent conjectures in convex geometry

(2011)

Two conjectures recently proposed by one of the authors are disproved.

## In favour of the time variable in classical thermodynamics

(2010)

A case for the teaching of classical thermodynamics with an explicit time variable, with phenomena involving changes in time, is made by presenting and solving an exercise in textbook style, and pointing out that the solution accords with experiment. The exercise requires an explicit treatment of the time variable. Further arguments are given for the advantages of an explicit time variable in classical thermodynamics, and against some standard terminology in this theory.

## On the relation between plausibility logic and the maximum-entropy principle: a numerical study

(2009)

What is the relationship between plausibility logic and the principle of maximum entropy? When does the principle give unreasonable or wrong results? When is it appropriate to use the rule 'expectation = average'? Can plausibility logic give the same answers as the principle, and better answers if those of the principle are unreasonable? To try to answer these questions, this study offers a numerical collection of plausibility distributions given by the maximum-entropy principle and by plausibility logic for a set of fifteen simple problems: throwing dice.
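As a hedged illustration of the kind of answer the maximum-entropy principle gives for dice, here is the classic Brandeis-style problem (not necessarily one of the paper's fifteen): find the distribution over the six faces with maximum entropy given a prescribed mean, here 4.5, by solving for the Lagrange multiplier numerically:

```python
import numpy as np

faces = np.arange(1, 7)
target_mean = 4.5

def mean_for(beta):
    # The max-ent solution under a mean constraint is exponential in the face value.
    w = np.exp(beta * faces)
    p = w / w.sum()
    return p @ faces

# Solve for the Lagrange multiplier beta by bisection.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
beta = 0.5 * (lo + hi)

p = np.exp(beta * faces)
p /= p.sum()
print(np.round(p, 4))   # probabilities increase monotonically toward face 6
```

Because the constrained mean (4.5) exceeds the uniform mean (3.5), the multiplier is positive and the distribution is tilted toward the high faces.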

## Effects of the g factor in semiclassical kinetic plasma theory

Physical Review Letters **101** (2008)

A kinetic theory for spin plasmas is put forward, generalizing those of previous authors. In the model, the ordinary phase space is extended to include the spin degrees of freedom. Together with Maxwell's equations, the system is shown to be energy conserving. Analyzing the linear properties, it is found that new types of wave-particle resonances are possible that depend directly on the anomalous magnetic moment of the electron. As a result, new wave modes, not present in the absence of spin, appear. The implications of our results are discussed. © 2008 The American Physical Society.

## 'Plausibilities of plausibilities': an approach through circumstances

(2006)

Probability-like parameters appearing in some statistical models, and their prior distributions, are reinterpreted through the notion of 'circumstance', a term which stands for any piece of knowledge that is useful in assigning a probability and that satisfies some additional logical properties. The idea, which can be traced to Laplace and Jaynes, is that the usual inferential reasonings about the probability-like parameters of a statistical model can be conceived as reasonings about equivalence classes of circumstances - real or hypothetical pieces of knowledge, such as physical hypotheses - that are uniquely indexed by the probability distributions they lead to.

## Numerical Bayesian state assignment for a three-level quantum system. I. Absolute-frequency data; constant and Gaussian-like priors

(2006)

This paper offers examples of concrete numerical applications of Bayesian quantum-state-assignment methods to a three-level quantum system. The statistical operator assigned on the evidence of various measurement data and kinds of prior knowledge is computed partly analytically, partly through numerical integration (in eight dimensions) on a computer. The measurement data consist in absolute frequencies of the outcomes of N identical von Neumann projective measurements performed on N identically prepared three-level systems. Various small values of N as well as the large-N limit are considered. Two kinds of prior knowledge are used: one represented by a plausibility distribution constant with respect to the convex structure of the set of statistical operators; the other represented by a Gaussian-like distribution centred on a pure statistical operator, and thus reflecting a situation in which one has useful prior knowledge about the likely preparation of the system. In a companion paper the case of measurement data consisting in average values, and an additional prior studied by Slater, are considered.
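A hedged two-level toy version of the same Bayesian state-assignment idea (the paper treats a three-level system and an eight-dimensional integration): the prior is flat over the Bloch ball, the data are outcome frequencies of N projective σ_z measurements, and the posterior-mean state is computed by Monte Carlo; all numbers are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

N, n_up = 20, 16                      # illustrative data: 16 '+' outcomes out of 20

# Monte-Carlo sample of Bloch vectors uniform in the unit ball
# (a prior constant with respect to the convex structure of qubit states).
M = 200_000
v = rng.uniform(-1.0, 1.0, size=(M, 3))
v = v[np.einsum('ij,ij->i', v, v) <= 1.0]

p_up = 0.5 * (1.0 + v[:, 2])          # Born rule: tr(rho P_+) = (1 + z) / 2
w = p_up**n_up * (1.0 - p_up)**(N - n_up)   # likelihood of the data

# Posterior-mean Bloch vector (analytically ≈ (0, 0, 0.5) for these data).
bloch_mean = (w[:, None] * v).sum(0) / w.sum()
print(np.round(bloch_mean, 3))
```

The x and y components vanish by symmetry, while the z component follows from the induced Beta posterior on p_up, giving the posterior-mean statistical operator directly.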

## Consistency of the Shannon entropy in quantum experiments

Physical Review A - Atomic, Molecular, and Optical Physics **69** (2004)

The consistency of the Shannon entropy applied to the statistical outcomes of quantum experiments is analysed. The Shannon entropy is found to be fully consistent: none of its properties are violated in quantum experiments. A change in the temporal order of the experiments, however, changes the outcome statistics and hence the entropy values; the Shannon entropy is thus always context-dependent in quantum experiments.
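A toy qubit illustration of this order dependence (hedged: an invented example, not taken from the paper): measuring σ_z then σ_x on a pure state gives a different joint outcome distribution, and hence a different Shannon entropy, than measuring σ_x then σ_z:

```python
import numpy as np

theta = np.pi / 6
psi = np.array([np.cos(theta), np.sin(theta)])   # initial pure state

def H(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

# Projectors for the sigma_z and sigma_x outcomes.
z_proj = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
x_plus = np.array([1.0, 1.0]) / np.sqrt(2)
x_minus = np.array([1.0, -1.0]) / np.sqrt(2)
x_proj = [np.outer(x_plus, x_plus), np.outer(x_minus, x_minus)]

def sequential_probs(first, second, psi):
    """Joint outcome probabilities of two sequential projective measurements."""
    probs = []
    for P in first:
        phi = P @ psi
        p1 = phi @ phi
        if p1 > 0:
            phi = phi / np.sqrt(p1)       # collapse after the first measurement
        for Q in second:
            q = Q @ phi
            probs.append(p1 * (q @ q))
    return np.array(probs)

H_zx = H(sequential_probs(z_proj, x_proj, psi))
H_xz = H(sequential_probs(x_proj, z_proj, psi))
print(H_zx, H_xz)   # the two temporal orders give different entropy values
```

The entropy difference comes entirely from the non-commutativity of the two observables: the first measurement reshapes the statistics available to the second.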

## A size criterion for macroscopic superposition states

Journal of Optics B: Quantum and Semiclassical Optics **6** (2004) 429-436

## Schrödinger-cat states - Size classification based on evolution or dissipation

Proceedings of SPIE - The International Society for Optical Engineering **5468** (2004) 335-343

The issue of estimating how "macroscopic" a superposition state is can be addressed by analysing the rapidity of the state's evolution under a preferred observable, compared to that of the states forming the superposition. This fast evolution, which arises from the larger dispersion of the superposition state for the preferred operator, also represents a useful characteristic for interferometric applications. This approach can be compared to others in which a superposition's macroscopality is estimated in terms of the fragility to dissipation.

## Hamiltonians for a general dilaton gravity theory on a spacetime with a non-orthogonal, timelike or spacelike outer boundary

Classical and Quantum Gravity **18** (2001) 779-792

A generalization of two recently proposed general relativity Hamiltonians, to the case of a general (d + 1)-dimensional dilaton gravity theory in a manifold with a timelike or spacelike outer boundary, is presented.

## Maximum-entropy and representative samples of neuronal activity: a dilemma

arXiv

The present work shows that the maximum-entropy method can be applied to a sample of neuronal recordings along two different routes: (1) apply to the sample; or (2) apply to a larger, unsampled neuronal population from which the sample is drawn, and then marginalize to the sample. These two routes give inequivalent results. The second route can be further generalized to the case where the size of the larger population is unknown. Which route should be chosen? Some arguments are presented in favour of the second. This work also presents and discusses probability formulae that relate states of knowledge about a population and its samples, and that may be useful for sampling problems in neuroscience.

## Inferring health conditions from fMRI-graph data

arXiv

Automated classification methods for disease diagnosis are currently in the limelight, especially for imaging data. Classification does not fully meet a clinician's needs, however: in order to combine the results of multiple tests and decide on a course of treatment, a clinician needs the likelihood of a given health condition rather than the binary classification yielded by such methods. We show how likelihoods can be derived step by step from first principles and approximations, and how they can be assessed and selected, illustrating our approach using fMRI data from a publicly available data set containing schizophrenic and healthy control subjects. We start from the basic assumption of partial exchangeability, then use the notion of sufficient statistics and the "method of translation" (Edgeworth, 1898) combined with conjugate priors. This method can be used to construct a likelihood that can be used to compare different data-reduction algorithms. Despite the simplifications and possibly unrealistic assumptions used to illustrate the method, we obtain classification results comparable to previous, more realistic studies about schizophrenia, whilst yielding likelihoods that can naturally be combined with the results of other diagnostic tests.
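A minimal sketch of the clinical point about combining tests (all numbers invented for illustration): a likelihood ratio from an imaging test can be multiplied with the prior odds and with other tests' likelihood ratios via Bayes' theorem, which a classifier's bare binary output cannot do:

```python
def posterior_odds(prior_odds, likelihood_ratios):
    """Posterior odds of the condition after combining independent tests."""
    odds = prior_odds
    for lr in likelihood_ratios:
        odds *= lr
    return odds

prior = 0.01                  # assumed prevalence of the condition
prior_odds = prior / (1 - prior)
lrs = [12.0, 4.0]             # hypothetical likelihood ratios: imaging test, second test
odds = posterior_odds(prior_odds, lrs)
prob = odds / (1 + odds)      # posterior probability of the condition
print(round(prob, 3))
```

Even two individually strong tests leave a moderate posterior probability when the prevalence is low, which is exactly the kind of trade-off a hard classification hides from the clinician.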