
Journal article

Information sensitivity functions to assess parameter information gain and identifiability of dynamical systems

Sanjay Pant (ORCID: 0000-0002-2081-308X)

Journal of The Royal Society Interface, Volume: 15, Issue: 142

Swansea University Author: Sanjay Pant

  • APCE004.Pant.20170871.full.pdf

    PDF | Version of Record

    Distributed under the terms of a Creative Commons CC-BY 4.0 Licence.

    Download (1.17MB)


DOI (Published version): 10.1098/rsif.2017.0871

Abstract

A new class of functions, called the ‘information sensitivity functions’ (ISFs), which quantify the information gain about the parameters through the measurements/observables of a dynamical system, is presented. These functions can be computed from classical sensitivity functions alone and are based on Bayesian and information-theoretic approaches. While marginal information gain is quantified by the decrease in differential entropy, correlations between arbitrary sets of parameters are assessed through mutual information. For individual parameters, these information gains are also presented as marginal posterior variances and, to assess the effect of correlations, as conditional variances when other parameters are given. The easy-to-interpret ISFs can be used to (a) identify time intervals or regions in dynamical system behaviour where information about the parameters is concentrated; (b) assess the effect of measurement noise on the information gain for the parameters; (c) assess whether sufficient information is available in an experimental protocol (input, measurements and their frequency) to identify the parameters; (d) assess correlation in the posterior distribution of the parameters to identify sets of parameters that are likely to be indistinguishable; and (e) assess identifiability problems for particular sets of parameters.
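The general idea behind the abstract can be sketched numerically. This is a minimal illustration, not the paper's exact formulation: assuming i.i.d. Gaussian measurement noise of variance sigma2 and a Gaussian prior, the classical sensitivity vectors s_i = dy(t_i)/dtheta accumulate into a posterior information matrix M_k = M_prior + (1/sigma2) * sum_{i<=k} s_i s_i^T, from which marginal posterior variances and the decrease in differential entropy can be tracked as measurements arrive. The function name and the toy sensitivities below are hypothetical.

```python
import numpy as np

def information_profile(sensitivities, sigma2, prior_cov):
    """Track parameter information gain as observations accumulate.

    Illustrative sketch only (assumes Gaussian noise and prior):
    sensitivities: (n_obs, n_params) array of dy(t_i)/dtheta
    sigma2:        measurement-noise variance
    prior_cov:     (n_params, n_params) prior parameter covariance

    Returns the marginal posterior variances after each observation and
    the cumulative decrease in differential entropy (nats) vs. the prior.
    """
    M = np.linalg.inv(prior_cov)                 # prior information matrix
    prior_logdet = np.linalg.slogdet(prior_cov)[1]
    marg_vars, entropy_drops = [], []
    for s in sensitivities:
        M = M + np.outer(s, s) / sigma2          # rank-one information update
        post_cov = np.linalg.inv(M)
        marg_vars.append(np.diag(post_cov).copy())
        # decrease in differential entropy = 0.5 * log(det(prior)/det(post))
        entropy_drops.append(0.5 * (prior_logdet - np.linalg.slogdet(post_cov)[1]))
    return np.array(marg_vars), np.array(entropy_drops)

# Toy example: two parameters, the second barely influencing the output,
# so its marginal posterior variance shrinks far more slowly.
t = np.linspace(0.0, 1.0, 20)
sens = np.column_stack([np.exp(-t), 0.05 * t])   # hypothetical sensitivities
mv, dh = information_profile(sens, sigma2=0.01, prior_cov=np.eye(2))
```

Because each update adds a positive-semidefinite term to the information matrix, the marginal variances are non-increasing and the entropy decrease is non-decreasing in time, which is what makes such profiles usable for spotting where information about each parameter is concentrated.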

Full description

Published in: Journal of The Royal Society Interface
ISSN: 1742-5689 (print); 1742-5662 (electronic)
Published: 31 May 2018
Online Access: Check full text

URI: https://cronfa.swan.ac.uk/Record/cronfa39546
Department: Mechanical Engineering, Swansea University
Funder: EPSRC, EP/R010811/1
Notes: Correction available: http://rsif.royalsocietypublishing.org/content/15/143/20180353