Dynamical systems

Uses and abuses of mathematics in biology
  • Mathematics has been less intrusive in the life sciences, possibly because until recently they have been largely descriptive, lacking the invariance principles and fundamental natural constants of physics.
  • Models of evolution, ecology, and epidemiology.
  • Applications of mathematics to sequencing the human and other genomes.
  • The virtue of mathematics is that it imposes clarity and precision on conjecture, allowing meaningful comparison between the consequences of basic assumptions and the empirical facts.
  • Mathematical models have proved to have many uses and to take many forms in the life sciences.
  • Mathematics, however, does not have the long-standing relationship with the life sciences that it has with the physical sciences and engineering.
  • The most common abuses, and not always the easiest to recognize, are situations in which mathematical models are built with an unbearable abundance of detail in some aspects, while other important facets of the problem are fuzzy or a vital parameter is uncertain to within an order of magnitude.
·researchgate.net·
Models in science
Models are of central importance in many scientific contexts.
Scientists spend significant amounts of time building, testing, comparing, and revising models
Phenomenological models have been defined in different, although related, ways. A common definition takes them to be models that only represent observable properties of their targets and refrain from postulating hidden mechanisms and the like
Exploratory models are models which are not proposed in the first place to learn something about a specific target system or a particular experimentally established phenomenon.
A model of data (sometimes also “data model”) is a corrected, rectified, regimented, and in many instances idealized version of the data we gain from immediate observation
Some models are physical objects. Such models are commonly referred to as “material models”
If a model is constructed in a domain where no theory is available, then the model is sometimes referred to as a “substitute model”
Models as a means to explore theory
Models as complements of theories
Models as preliminary theories
Models as mediators
·plato.stanford.edu·
Misinterpretation of confidence intervals
  • Our findings suggest that many researchers do not know the correct interpretation of a CI.
  • The misunderstandings surrounding p-values and CIs are particularly unfortunate.
  • Null hypothesis significance testing (NHST) has been criticized for many reasons, including its inability to provide the answers that researchers are interested in.
  • Within the frequentist framework, a popular alternative to NHST is inference by CIs, which are often claimed to be a better and more useful alternative.
  • CIs provide information on any hypothesis, whereas NHST is informative only about the null.
  • CIs give direct insight into the precision of the procedure and can therefore be used as an alternative to power calculations.
  • The findings above show that people interpret data differently depending on whether these data are presented through NHST or CIs.
  • A CI is a numerical interval constructed around the estimate of a parameter. Such an interval does not, however, directly indicate a property of the parameter; instead, it indicates a property of the procedure, as is typical for a frequentist technique.
  • It is incorrect to interpret a CI as the probability that the true value is within the interval.
  • Correct interpretation: if we were to repeat the experiment over and over, then 95% of the confidence intervals would contain the true mean.
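To make the correct interpretation concrete, here is a minimal simulation sketch (my own illustration, not from the paper, assuming NumPy/SciPy): coverage is a property of the interval-generating procedure, so roughly 95% of intervals computed this way capture the fixed true mean.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_mean, sigma, n, reps = 10.0, 2.0, 20, 10_000

covered = 0
for _ in range(reps):
    sample = rng.normal(true_mean, sigma, n)
    # 95% t-based CI for the mean of this one sample
    half_width = stats.t.ppf(0.975, df=n - 1) * sample.std(ddof=1) / np.sqrt(n)
    covered += abs(sample.mean() - true_mean) <= half_width

print(f"Empirical coverage: {covered / reps:.3f}")  # ~0.95, a property of the procedure
```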
·ejwagenmakers.com·
A brief biography of Paul A. M. Dirac and the historical development of the Dirac delta function
  • Mathematically, it is not an ordinary function.
  • It can informally be interpreted as the derivative of the Heaviside unit step function.
  • It can be defined as a generalized function or a distribution.
  • Dirac's original mathematical intuition of the 1920s and his systematic use of the Dirac delta function in quantum mechanics made it an effective and useful tool in mathematical physics and applied mathematics, and its clarity and elegance led him to state strongly that it must be true. To provide a proper mathematical justification of the Dirac delta function, S. L. Sobolev (1908–1989), a famous Russian mathematician, first introduced the idea of generalized functions in 1936. About 14 years later, this grew into a whole new discipline, the theory of distributions.
  • Figure: (a) the rectangular (or Dirac) delta sequence; (b) the Gaussian δ-sequence.
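For reference, the two δ-sequences mentioned in the figure can be written as (standard definitions; the scaling convention may differ from the paper's):

$$
\delta_\varepsilon^{\mathrm{rect}}(x)=\begin{cases}1/\varepsilon, & |x|\le \varepsilon/2\\ 0, & \text{otherwise}\end{cases}
\qquad
\delta_\varepsilon^{\mathrm{Gauss}}(x)=\frac{1}{\varepsilon\sqrt{\pi}}\,e^{-x^{2}/\varepsilon^{2}}
$$

Both integrate to 1 for every ε > 0, and for any smooth test function f, ∫ δ_ε(x) f(x) dx → f(0) as ε → 0, which is the distributional meaning of δ.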
·tandfonline.com·
Stiffness of ordinary differential equations
Stiffness is a subtle concept that plays an important role in assessing the effectiveness of numerical methods for ordinary differential equations. (This article is adapted from section 7.9, …)
A problem is stiff if the solution being sought varies slowly, but there are nearby solutions that vary rapidly, so the numerical method must take small steps to obtain satisfactory results.
Nonstiff methods can solve stiff problems; they just take a long time to do it.
For truly stiff problems, a stiff solver can be orders of magnitude more efficient, while still achieving a given accuracy
You don't want to change the differential equation or the initial conditions, so you have to change the numerical method.
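A standard illustration of this definition is the flame-propagation model Moler uses in his discussions of stiffness (stated here from memory, so treat the details as illustrative):

$$
\dot{y} = y^{2} - y^{3}, \qquad y(0) = \delta, \qquad 0 \le t \le 2/\delta
$$

For small δ the solution creeps upward for a long time, jumps to y ≈ 1 almost instantaneously near t ≈ 1/δ, and then stays flat; the flat phase is exactly the regime where the sought solution varies slowly while nearby solutions vary rapidly, forcing a non-stiff method to take tiny steps.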
·blogs.mathworks.com·
Verification and validation of computer simulation models
Simulation models are approximate imitations of real-world systems; they never imitate the real-world system exactly
model verification is the process of confirming that the model is correctly implemented with respect to the conceptual model (it matches the specifications and assumptions deemed acceptable for the purpose of the application)
Validation checks the accuracy of the model's representation of the real system.
·es.wikipedia.org·
Computer simulations in science
The list of sciences that make extensive use of computer simulation has grown to include astrophysics, particle physics, materials science, engineering, fluid mechanics, climate science, evolutionary biology, ecology, economics, decision theory, medicine, sociology, epidemiology, and many others.
In its narrowest sense, a computer simulation is a program that is run on a computer and that uses step-by-step methods to explore the approximate behavior of a mathematical model. Usually this is a model of a real-world system (although the system in question might be an imaginary or hypothetical one)
More broadly, we can think of computer simulation as a comprehensive method for studying systems.
Equation-based Simulations
Agent-based simulations
But some simulation models are hybrids of different kinds of modeling methods. Multiscale simulation models, in particular, couple together modeling elements from different scales of description.
Monte Carlo Simulations
MC simulations are computer algorithms that use randomness to calculate the properties of a mathematical model and where the randomness of the algorithm is not a feature of the target model.
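A minimal example of that definition (my sketch, plain Python): estimating the deterministic constant π with a randomized algorithm; the randomness lives entirely in the method, not in the target quantity.

```python
import random

random.seed(42)
n = 1_000_000
# Fraction of random points in the unit square that fall inside the quarter circle
hits = sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))
print(f"pi ≈ {4 * hits / n:.4f}")  # converges to pi as n grows
```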
Purposes of Simulation
What warrants our taking a computer simulation to be a severe test of some hypothesis about the natural world?
Verification can be divided into solution verification and code verification.
The principal strategy of validation involves comparing model output with observable data.
A simulation that accurately mimics a complex phenomenon contains a wealth of information about that phenomenon.
Philosophers, consequently, have begun to consider in what sense, if any, computer simulations are like experiments and in what sense they differ.
computer simulations have profound implications for our understanding of the structure of theories
The identity thesis. Computer simulation studies are literally instances of experiments.
·plato.stanford.edu·
Electromechanical analogy
A procedure for modeling electrical systems as combinations of mechanical elements, and vice versa
Electromechanical analogies are used to model the behavior of a mechanical system through an equivalent electrical system, establishing analogies between mechanical and electrical parameters.
An electromechanical analogy consists of representing a mechanical system by an electrical circuit.
This approach is especially useful in the design of mechanical filters, since simple electrical devices can emulate much more expensive and complex mechanical systems.
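The classic impedance (force-voltage) analogy makes this concrete: a damped mass-spring system and a series RLC circuit obey equations of the same form,

$$
m\ddot{x} + b\dot{x} + kx = F(t)
\qquad\Longleftrightarrow\qquad
L\ddot{q} + R\dot{q} + \frac{q}{C} = V(t)
$$

so mass maps to inductance, damping to resistance, spring compliance 1/k to capacitance, and force to voltage.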
·es.wikipedia.org·
Simulating the spread of a disease under different scenarios
The initial trickle of new coronavirus infections has turned into a steady stream. We can learn how to slow it through simple simulations.
moderate social distancing usually works better than attempted quarantine
extensive social distancing usually works better than anything else
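The article's simulations are agent-based, but the qualitative ranking can be sketched with a plain SIR model (a hypothetical Python sketch; the β values are illustrative, not the article's): stronger distancing means a lower contact rate β, and a lower, later infection peak.

```python
import numpy as np
from scipy.integrate import solve_ivp

def sir(t, y, beta, gamma):
    s, i, r = y
    return [-beta * s * i, beta * s * i - gamma * i, gamma * i]

y0, gamma = [0.999, 0.001, 0.0], 0.1  # initial fractions; ~10-day infectious period
for label, beta in [("free-for-all", 0.40),
                    ("moderate distancing", 0.25),
                    ("extensive distancing", 0.15)]:
    sol = solve_ivp(sir, (0, 300), y0, args=(beta, gamma), max_step=1.0)
    print(f"{label:20s} peak infected: {sol.y[1].max():.1%}")
```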
·washingtonpost.com·
When all models are wrong
More stringent quality criteria are needed for models used at the science/policy interface. Here is a checklist to aid in the responsible development and use of models.
such skepticism seems increasingly common and increasingly independent of ideological position
does the tone of these attacks reflect a collapse of trust in the scientific enterprise and in its social and institutional role?
society may be less willing to accept such claims than in the past
the dangers to science become most apparent when models (summaries of more complex real-world problems, usually expressed in mathematical terms) are used as policy tools.
a particularly accessible series of horror stories about the misuse of models and the resulting policy failures
the accuracy and value of the predictions themselves end up at the center of policy debates and distract from the need, and the capacity, to address the problem despite the current uncertainties.
"all models are wrong, but some are useful"
strict transparency criteria should be adopted when models are used as the basis for policy assessments
the better the climate is understood, the more uncertain model predictions of specific climate futures become
Many small uncertainties multiplied together produce enormous aggregate uncertainties.
The challenge becomes even more daunting when modelers turn their attention to the economic consequences of changes in atmospheric composition.
Such an effort is so far removed from current predictive capability that it borders on the irresponsible.
most aspects of most models will not be subject to such close scrutiny
they rely on what the models forecast about the future, with little or no sensitivity to the limits of what the models are actually capable of forecasting accurately.
we propose seven rules that together form a checklist for the responsible development and use of models.
global sensitivity analysis
Global sensitivity analysis assesses the relative importance of the input factors in terms of the impact of their uncertainty on the model output.
Which of those uncertainties has the greatest impact on the result?
"sensitivity auditing"
The "auditing" emphasizes the idea of accountability to a wider audience (in this case, policymakers and the public) and therefore demands that the model be accessible and transparent, and that expertise not be defined so narrowly as to exclude everyone except those who built the model
A sensitivity audit does not aim to improve the model; rather, like a tax audit, it takes place at the end of the process, once the model becomes a tool for policy assessment and the developers have carried out all the model calibration, optimization, data assimilation and the like using the tools of their trade.
Rule 1: Use models to clarify, not to obscure.
Rule 1 prescribes asking who benefits from the model and what motivations and incentives animate the model's developers.
Rule 2: Adopt an "assumption hunting" attitude.
New processes for model development and use are required, in which engaged stakeholders work with disciplinary experts to develop new models that can be used to test various policy options
it demands that models used to guide regulatory decisions be made available to a third party, to allow assessment of how changing assumptions or input parameters affects the model-based conclusion.
Rule 3: Detect pseudoscience.
we define pseudoscience as the practice of ignoring or hiding uncertainties in model inputs so as to guarantee that model outputs can be tied to preferred policy options.
A common indicator of this kind of pseudoscience is spurious precision, for example when a result is reported with a number of digits that exceeds (sometimes ridiculously) any plausible estimate of the associated accuracy.
C. F. Gauss: "lack of mathematical culture is revealed nowhere so conspicuously as in meaningless precision in numerical computations"
Do not publish your inferences without having carried out a careful sensitivity audit.
Rule 4: Find sensitive assumptions before they find you.
"Thou shalt confess in the presence of sensitivity. Corollary: thou shalt anticipate criticism"
Any model-based inference introduced into the policy arena without an accompanying, technically sound sensitivity analysis should be regarded as suspect.
Rule 5: Aim for transparency. Stakeholders should be able to make sense of, and if possible replicate, the results of the analysis.
simple or parsimonious model representations are better than more "sophisticated" or complex models when used for policy impact assessments.
Rule 6: Don't just "do the sums right"; "do the right sums". When relevant stakeholder viewpoints are neglected, modelers may focus on, or address, the wrong uncertainties.
a type one error is a false positive: a practice is determined to be unsafe when it is safe, or an intervention not beneficial when it is beneficial. A type two error is the opposite: a false negative. A type three error, by contrast, is one in which the analysis itself is framed incorrectly and the problem is therefore mischaracterized.
it is easy to fall into what we may call "lamp-posting", in which the uncertainties or parameters examined most closely are the least relevant but the easiest to analyze
Rule 7: Focus the analysis. Do not do superficial sensitivity analyses that simply change one factor at a time.
For example, in a system with 10 uncertain factors, moving only one at a time risks exploring only a tiny fraction of the total potential input uncertainty (see the sketch below).
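A back-of-the-envelope way to see this (my sketch, following the geometric argument often made by Saltelli and colleagues): one-at-a-time (OAT) moves stay inside the hypersphere inscribed in the unit hypercube of input factors, and in 10 dimensions that sphere occupies only about 0.25% of the cube's volume.

```python
import math

# Volume ratio of the inscribed hypersphere (radius 1/2) to the unit
# hypercube in k dimensions: pi^(k/2) / (2^k * Gamma(k/2 + 1)).
for k in (2, 5, 10):
    ratio = math.pi ** (k / 2) / (2 ** k * math.gamma(k / 2 + 1))
    print(f"k={k:2d}: sphere/cube volume = {ratio:.4%}")
# k=10 gives ~0.25%: OAT designs leave over 99.7% of the input space unexplored.
```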
the choice of a baseline is itself an assumption-laden process, and is therefore subject to criticism and sensitivity analysis.
A credible sensitivity audit should not be anchored to baselines that are themselves subjective; it should assess the effect of any input while all the other inputs are also allowed to vary.
Our view is that current modeling practices, in their development and use, are a significant threat to the legitimacy and usefulness of science in contested policy settings.
·issues.org·
Why do we need stiff ODE solvers?
I hope you have noticed the new MATLAB Central blog: Cleve's Corner. Cleve once said…
the output is flat most of the time, but shows a very fast sudden transition. This is typical of stiff systems.
compare the results between ode45 and ode23s
ode45 is not able to recover after the fast transition and keeps taking very small steps. In this specific example, ode45 required 3046 steps to solve the problem, while ode23s required only 91 steps! ode23s may do more computations per step, but it can take far fewer steps than ode45.
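A rough Python analogue of that experiment (my sketch, assuming SciPy; the blog itself uses MATLAB's ode45 and ode23s): the explicit RK45 stands in for ode45 and the implicit Radau for the stiff solver, on the flame problem y' = y² − y³.

```python
from scipy.integrate import solve_ivp

delta = 1e-4                       # small initial value makes the problem stiff
flame = lambda t, y: y**2 - y**3

for method in ("RK45", "Radau"):   # non-stiff vs stiff solver
    sol = solve_ivp(flame, (0, 2 / delta), [delta], method=method, rtol=1e-4)
    print(f"{method:6s} accepted steps: {sol.t.size - 1}")
# The implicit solver typically needs orders of magnitude fewer steps
# once the solution flattens out after the fast transition.
```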
·blogs.mathworks.com·
Models with higher effective dimensions tend to produce more uncertain estimates
Without validation data, more complex models may succumb to uncertainty.
modelers can better ponder whether the addition of detail truly matches the model’s purpose
transmission model of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), based on more than 900 parameters
the development of finer-grained models often proceeds without having, at the scale required, specific data available to train or validate the model
modelers cannot benefit from existing statistical instruments that help balance model complexity with error so as to align with science’s quest for parsimony, such as Akaike’s (13) or Schwarz’s (14) information criterion
modelers can gauge the connection between model complexity and uncertainty at all stages of model development by calculating the model’s “effective dimensions,” that is, the number of influential parameters and active higher-order effects
effective dimensions
the addition of model detail in process-based models tends to produce more (and not less) uncertain estimates because it increases the model’s effective dimensions, which generally boost the output variance
statistical theory behind the connection between model uncertainty, complexity, and the notion of effective dimensions
how the concept of effective dimensions can help modelers balance model complexity with uncertainty
number of parameters and the pattern of their connections as key contributors to the complexity of mathematical models
aggregate complexity
The effective dimension ks is therefore the order of the highest effect that needs to be included in Eq. 1 to reach p
more complex models will generally display a higher effective dimension in kt and ks, an increase that promotes a larger output uncertainty
a model with more parameters does not necessarily lead to a larger uncertainty if n ≠ k
classic susceptible-infected-recovered (SIR) model is often enhanced with the addition of fine-grained features such as seasonality or age stratification
If the goal of the model is to gain insights into the effects that vaccination and nonpharmaceutical interventions may have on the spread of the virus, then the SIR(S-V) might be preferred over the more complex SIR(S-E) because the extra detail in the latter does not help clarify potential courses of action
importance of stripping mathematical models of superfluous parameters, processes, or linkages
T_i includes all the terms in Eq. 3 that carry the index i
·science.org·
Science in the post-normal age
In my view, post-normal science pays more attention to people's influence, in the form of uncertainty in a special kind of input: perturbations (influence, ideology, ethics, politics, interests).
Post-Normal Science (PNS)
facts uncertain, values in dispute, stakes high and decisions urgent
it really applies to politics, and not to science
correctly defining the characteristics of both the world, and the relevant science
Post-Normal Science
https://en.wikipedia.org/wiki/Post-normal_science
there are essential social and political dimensions of the problem-solving practice
We use the two attributes of systems uncertainties and decision stakes to distinguish among these.
traditional methodologies are ineffective
The reductionist, analytical worldview which divides systems into ever smaller elements, studied by ever more esoteric specialism, is being replaced by a systemic, synthetic and humanistic approach
In this ‘normal’ state of science, uncertainties are managed automatically, values are unspoken, and foundational problems unheard of.
uncertainty is not banished but is managed, and values are not presupposed but are made explicit
quality of scientific information
problem-solving strategies, analysed in terms of uncertainties in knowledge and complexities in ethics.
uncertainty cannot be banished from science; but that good quality of information depends on good management of its uncertainties
applied science is ‘mission-oriented’; professional consultancy is ‘client-serving’; and post-normal science is ‘issue-driven’
issue-driven
their resolution requires new conceptions of scientific methodology
different kinds of uncertainty can be expressed, and used for an evaluation of quality of scientific information
three zones representing and characterizing three kinds of problem-solving strategies
the research exercise is generally not undertaken unless there is confidence that the uncertainties are low, that is that the problem is likely to be soluble by a normal, puzzle-solving approach
professional consultancy includes applied science
uncertainty is at the methodological level
will be in conflict, involving various human stakeholders and natural systems as well
Of engineering we could say that most routine engineering practice is a matter of empirical craft skills using the results of applied science, while at its highest levels it becomes true professional consultancy.
The outcomes of applied science exercises, like those of core science, have the features of reproducibility and prediction.
professional tasks deal with unique situations, however broadly similar they may be.
it would be unrealistic to expect two safety engineers to produce the same model (or the same conclusions) for a hazard analysis of a complex installation.
these policy issues involve professional consultancy, such disagreements should be seen as inevitable and healthy.
We can envisage four components in the problem-solving task; the process, the product, the person and the purpose
[Diagram: the four components of the problem-solving task: person, process, product, purpose]
In professional consultancy there can be no simple, objective criteria or processes for quality assurance (beyond simple competence)
post-normal science occurs when uncertainties are either of the epistemological or the ethical kind, or when decision stakes reflect conflicting purposes among stakeholders
post-normal science is indeed a type of science, and not merely politics or public participation
Because of the high level of uncertainty, approaching sheer ignorance in some cases
The uncertainties go beyond those of the systems, to include ethics as well.
In post-normal science, the manifold uncertainties in both products and processes require that the relative importance of persons becomes enhanced.
It is important to appreciate that post-normal science is complementary to applied science and professional consultancy. It is not a replacement for traditional forms of science
·commonplace.knowledgefutures.org·
A chronology of sensitivity analysis
The last half-century has seen spectacular progress in computing and modelling in a variety of fields, applications, and methodologies. Over the s…
evolution from local to global methods
future directions and areas of growth in the field
design of experiments
While uncertainty analysis studies the uncertainty in the output, sensitivity analysis studies how the uncertainty in the output can be allocated to the different sources of uncertainty in the input
Sensitivity analysis serves various purposes, including model validation, dimensionality reduction, prioritization of research efforts, pinpointing critical regions within the space of uncertainties under investigation, and aiding decision-making by quantifying how input variations impact outcome uncertainty
independent discipline recognised also by institutional guidelines (European Commission
This “global” understanding started in the 1970s
the sensitivity analysis panorama is still dominated in practically all disciplines by the so-called "local" approaches
tension still characterizing different schools of sensitivity analysis
a good date to set the start of sensitivity analysis is 1905, when Karl Pearson, the founder of modern statistics, proposed the idea of the correlation ratio (known as the η² index)
formalization of the experimental design in the 1920s and 1930s by the statistician Ronald Fisher.
process whereby statistics managed to adjudicate the authority to assess the ‘realism of causes’
World War 2 provided a significant impetus for the expansion and application of sensitivity analysis within the field of operational research
Experimental design continued to develop with several important advancements in this field in the 1950s, including the widespread adoption of factorial designs
Fourier amplitude sensitivity test (FAST) in the early 1970s
In the early 1980s, a notable contribution was made by Efron and Stein (1981) on the decomposition of the variance into terms of increasing dimensionality
One of the key breakthroughs during this time was the adoption of random sampling techniques
In 1993, Sobol’ introduced an innovative approach to sensitivity analysis based on the decomposition of the output variance
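For reference, the standard form of that decomposition (my summary, in the usual notation): for independent inputs, the output variance splits into terms of increasing order, and the first-order index S_i measures the share of variance attributable to X_i alone.

$$
V(Y)=\sum_{i} V_i+\sum_{i<j} V_{ij}+\cdots+V_{12\ldots k},
\qquad
S_i=\frac{V_i}{V(Y)}=\frac{V\big[E(Y\mid X_i)\big]}{V(Y)}
$$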
Towards the end of the 1990s, a brand-new community of sensitivity-analysis practitioners emerged, reflecting on the concept of “global sensitivity analysis”
introduced the innovative winding stairs method for computing higher-order effects
emphasizing its application in various settings like factor prioritization, factor fixing, and variance reduction, stressing the need for global methods in order to treat non-linear and non-additive models
A first international conference on global sensitivity analysis (SAMO) was organized in 1995
local sensitivity analysis methods remained prevalent across disciplines
SA practitioners also started to compare the performance of sensitivity analysis methods using SA itself
researchers focused on developing methods for sensitivity analysis of computationally expensive models
The trend toward model complexification emphasized the importance of using sensitivity analysis to ensure accurate and reliable model outputs.
The COVID-19 pandemic was partly instrumental with this development, leading several authors to question the political use of models
Sensitivity analysis and auditing have recently been proposed as tools to jointly match the double demand for technical and normative quality in modelling
the penetration of global sensitivity analysis methods into the broader modelling community has not reached its full potential
Notably absent fields include finance and economics, and, to a lesser extent, medicine and related fields such as psychology and neuroscience
hypothesis testing (e.g., the Dunnett test), which is capable of answering similar questions.
If sensitivity analysis has not been widely adopted or promoted within a particular discipline, researchers might be less inclined to explore its potential benefits
there is a growing demand for user-friendly tools
In the future, sensitivity analysis is expected to play a pivotal role in guiding model development and decision-making processes, especially as simulation models become increasingly bigger and more complex
·sciencedirect.com·
Five ways to ensure that models serve society: a manifesto
Pandemic politics highlight how predictions need to be transparent and humble to invite insight, not blame.
The COVID-19 pandemic illustrates perfectly how the operation of science changes when questions of urgency, stakes, values and uncertainty collide — in the ‘post-normal’ regime.
decision makers and citizens need to establish new social norms
Here we present a manifesto for best practices for responsible mathematical modelling
Mind the assumptions: assess uncertainty and sensitivity.
One way to mitigate these issues is to perform global uncertainty and sensitivity analyses.
Mind the hubris: complexity can be the enemy of relevance.
there is a trade-off between the usefulness of a model and the breadth it tries to capture
As more parameters are added, the uncertainty builds up (the uncertainty cascade effect)
The complexity of a model is not always an indicator of how well it captures the important features.
the time needed for water to percolate down to the underground repository level — was uncertain by three orders of magnitude, rendering the size of the model irrelevant
no one can be held accountable if the predictions are catastrophically wrong
Mind the framing: match purpose and context.
No one model can serve all purposes.
all presuppose a set of values about what matters
how to produce a model, assess its uncertainty and communicate the results.
Whenever a model is used for a new application with fresh stakeholders, it must be validated and verified anew.
Mind the consequences: quantification can backfire.
Undiscriminating use of statistical tests can substitute for sound judgement.
Spurious precision adds to a false sense of certainty.
some might imagine a confidence of two significant digits
trust is essential for numbers to be useful
Mind the unknowns: acknowledge ignorance.
models can hide ignorance
Experts should have the courage to respond that “there is no number-answer to your question”
Questions, not answers: mathematical models are a great way to explore questions.
Asking models for certainty or consensus is more a sign of the difficulties in making controversial decisions than it is a solution
good modelling cannot be done by modellers alone. It is a social activity.
Following these five points will help to preserve mathematical modelling as a valuable tool.
·nature.com·
Models: a state of exception
Models live in a state of exception.
Their versatility, the variety of their methods, the impossibility of their falsification and their epistemic authority allow mathematical models to escape, better than other cases of quantification, the lenses of sociology and other humanistic disciplines
Models are thus both underexplored and overinterpreted
more actors should engage in practices such as assumption hunting, modelling of the modelling process, and sensitivity analysis and auditing
The state of exception also results from the pretence of neutrality customarily associated with mathematics, and from the asymmetry between developers and users.
‘Funny Numbers’ produced by econometricians to assess the risk of financial instruments
reciprocal domestication between models and society
models are special compared to other families of quantification
Modellers are regarded as endowed with privileged access to the foundations of reality.
Models do not meet classic (Popperian) criteria of scientificity.
a major problem in our use of mathematical models lies in assimilating them to physical laws, and hence treating their predictions with the same dignity
models act as integrators of a broad array of ingredients, including theoretical notions, mathematical concepts and techniques, stylized facts, empirical data, policy views, analogies and metaphors.
models are like metaphors that help us understand how we see the world
consequences of being special
asymmetry between developers and users
models universally known to be wrong continue to play a role in economic policy decisions
Models lend themselves to trans-science. Trans-science refers to scientific practices that appear to be formulated in the language of science, but that science cannot solve because of their sheer complexity or insufficient knowledge
Vulnerability to modelling hubris
models that are too complex (burdened by an excessive number of estimated parameters) may lead to greater imprecision due to error propagation.
Thinking about the reproducibility of models
malpractices such as HARKing [Hypothesizing After the Results are Known (Kerr 1998)]
In practice, it is safest to treat each run of a model as an experiment.
Sensitivity analysis and sensitivity auditing
Reforming modelling will involve both the adoption of methods in the practice of research and their incorporation into teaching syllabuses.
Uncertainty quantification should be at the heart of the scientific method
there is an evident need for greater clarity on how risk numbers are computed.
The notion that a model can be an avenue of possibly instrumental 'displacement', in the sense of moving attention from the system to its model, as discussed in Rayner (2012), is still too radical for many practitioners to contemplate.
preregistration of modelling studies (Ioannidis 2022) is still a long way off
·tandfonline.com·
The future of sensitivity analysis: an essential discipline for systems modeling and policy support
Sensitivity analysis (SA) is en route to becoming an integral part of mathematical modeling. The tremendous potential benefits of SA are, however, yet…
Sensitivity analysis (SA), in the most general sense, is the study of how the ‘outputs’ of a ‘system’ are related to, and are influenced by, its ‘inputs’.
‘factors’ in SA, may include model parameters, forcing variables, boundary and initial conditions, choices of model structural configurations, assumptions and constraints.
scientific discovery to explore causalities
dimensionality reduction
decision support
It has roots in ‘design of experiments’ (DOE) which is a broad family of statistical methods
We believe that SA is en route to becoming a mature and independent, but interdisciplinary and enabling, field of science.
opinions on the possible and desirable future evolutions of SA science and practice
The modern era of SA has focused on a notion that is commonly referred to as ‘Global Sensitivity Analysis (GSA)’
Such measures are said to be ‘derivative-based’ as they either analytically compute derivatives or numerically quantify the change in output when factors of interest (continuous or discrete) are perturbed around a point.
The full variance-based SA framework was laid down by Ilya Sobol’ in 1993
One persistent issue in SA is that nearly all applications, regardless of the method used, rest on the assumption that inputs are uncorrelated
ignoring correlation effects and multivariate distributional properties of inputs largely biases, or even falsifies, any SA results
Applications of SA are widespread across many fields, including earth system modeling (Wagener and Pianosi, 2019), engineering (Guo et al., 2016), biomechanics (Becker et al., 2011), water quality modeling (Koo et al., 2020a and 2020b), hydrology (Shin et al., 2013; Haghnegahdar and Razavi, 2017), water security (Puy et al., 2020c), nuclear safety (Saltelli and Tarantola, 2002; Iooss and Marrel, 2019) and epidemiology
it is not a formally recognized discipline
its application in some fields might appear under other titles.
why do I need to run SA for a given problem and what is the underlying question that SA is expected to answer?
how should I design the SA experiment to address that underlying question?
Teach SA more broadly and consistently
a dominant application of SA is for parameter screening, to support model calibration by identifying and fixing non-influential parameters.
Management of uncertainty through its characterization and attribution should be at the heart of the scientific method and, a fortiori, in the use of science for policy
while models are becoming more and more complex, they are treated more and more like a black-box, even by model developers themselves.
SA has significant potential to help in diagnosing the behavior of a mathematical model and for assessing how plausibly the model mimics the system under study for the given application.
To diagnostically test a model, one may compare SA results with expert knowledge on how the underlying system being modeled works.
Most models are poorly-identifiable, largely because of over-parameterization relative to the data and information available
SA and identifiability analysis (IA) are different but complementary
an insensitive parameter is non-identifiable, but the converse is not necessarily true, that is, a sensitive parameter may or may not be identifiable.
Model reduction, however, should be done with caution, as a parameter that seems non-influential under a particular condition might become quite influential under a new condition
fixing parameters that have small sensitivity indices may result in model variations that cannot be explained in the lower dimensional space
Development of research-specific software is at the core of modern modeling efforts.
Computational burden has been a major hindrance to the application of modern SA methods to real-world problems.
The application of SA with machine learning is further complicated because of the fundamental differences between machine learning and other types of models
calls for mutual trust between model developers and end users
The future, therefore, needs new generations of algorithms to keep pace with the ever-increasing complexity and dimensionality of the state-of-the-art models.
A complete assessment of the computational performance of any SA algorithm must be conducted across four aspects: efficiency, convergence, reliability and robustness.
an SA algorithm is robust to sampling variability if its performance remains almost ‘identical’ when applied on two different sample sets taken from the same model.
bootstrapping (Efron, 1987) is often used with SA algorithms to estimate robustness in the form of uncertainty distributions on sensitivity indices without requiring additional model evaluations.
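A hedged sketch of that idea in Python (illustrative only; real GSA estimators are more involved): resample an existing set of model evaluations with replacement and recompute a simple sensitivity measure, obtaining an uncertainty band without any new model runs.

```python
import numpy as np

rng = np.random.default_rng(1)

# Pretend these are stored model evaluations: inputs X (n x k) and output y.
n, k = 2_000, 3
X = rng.uniform(size=(n, k))
y = 4 * X[:, 0] + 2 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(0, 0.1, n)

def src(X, y):
    """Standardized regression coefficients: a crude linear sensitivity measure."""
    beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(y)), X]), y, rcond=None)
    return beta[1:] * X.std(axis=0) / y.std()

# Bootstrap: recompute the measure on resampled rows of the same evaluations.
boot = np.array([src(X[idx], y[idx])
                 for idx in rng.integers(0, n, size=(500, n))])
for i in range(k):
    lo, hi = np.percentile(boot[:, i], [2.5, 97.5])
    print(f"X{i+1}: SRC 95% interval = [{lo:.3f}, {hi:.3f}]")
```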
future of SA may step more towards ‘sampling-free’ algorithms that can work on any ‘given data’
More recently, authors have proposed parameter estimation procedures based on nearest neighbors (Broto, 2020), rank statistics (Gamboa et al., 2020) and robustness-based optimization (Sheikholeslami and Razavi, 2020)
Higher dimensionality exacerbates the difficulty of assigning multivariate distributions to uncertain inputs
cases require excessively large sample sizes
sampling strategies
regarding uncertainty estimates, it is notable that only a minority of works apply this quantification systematically
Sensitivity analysis of sensitivity analysis
informal (and often local) SA has contributed and will continue to contribute to a variety of decision-making problems.
Understand whether the current state of knowledge on input uncertainty is sufficient to enable a decision to be taken
Computational burden is recognized as a major hindrance to the application of SA to cases where SA can be most useful, such as for high-dimensional problems
·sciencedirect.com·
Why are so many published sensitivity analyses false?
Sensitivity analysis provides information on the relative importance of model input parameters and assumptions. It is distinct from uncertainty analys…
Many highly-cited papers (42% in the present analysis) present a SA of poor quality.
It is therefore essential to understand the impact of these uncertainties on the model output, if the model is to be used effectively and responsibly in any decision-making process.
sensitivity analysis is “the study of how the uncertainty in the output of a model (numerical or otherwise) can be apportioned to different sources of uncertainty in the model input”
uncertainty analysis (UA), which, as we define it here, characterizes the uncertainty in model prediction, without identifying which assumptions are primarily responsible
type and structure of model, parameters, resolution, calibration data and so forth
Monte Carlo simulation
“design sensitivity analysis” is used as a tool for structural optimisation
While most practitioners of SA distinguish it from UA, modellers overall tend to conflate the two terms, e.g. performing an uncertainty analysis and calling it a sensitivity analysis.
The sensitivity analysis methodology often relies on so-called local techniques which are invalid for nonlinear models.
The greatest density of papers is found in decision science
an even smaller fraction of papers that feature sensitivity analysis adopts a global SA approach.
at least one-third of highly cited papers, matching our search criteria, use deficient OAT methods.
sensitivity analysis is intrinsically attached to modelling, which itself is not a unified subject.
most scientists conflate the meaning of SA and UA.
global sensitivity analysis unavoidably requires a good background in statistics to implement and to interpret results.
the majority of practitioners remain scattered in isolated pockets, and sensitivity analysis is hence not part of a recognized syllabus.
Both uncertainty and sensitivity analysis should be based on a global exploration of the space of input factors, be it using an experimental design, Monte Carlo or other ad-hoc designs.
perform both uncertainty and sensitivity analysis
to focus the sensitivity analysis on the question addressed by the model rather than more generally on the model
If the model is wrong or if it is a poor representation of reality, determining the sensitivity of an individual parameter in the model is a meaningless pursuit
·sciencedirect.com·
MATLAB customer stories
Various use cases of MATLAB in companies, research centers, and educational institutions.
·mathworks.com·
Best practices for scientific computing
We describe a set of best practices for scientific software development, based on research and experience, that will improve scientists' productivity and the reliability of their software.
Write programs for people, not computers
A program should not require its readers to hold more than a handful of facts in memory at once.
Make names consistent, distinctive, and meaningful.
Make the computer repeat tasks.
Make incremental changes
Modularize code rather than copying and pasting.
Re-use code instead of rewriting it.
Optimize software only after it works correctly
Document design and purpose, not mechanics.
·journals.plos.org·
A history of MATLAB
1978: first version, written in Fortran
1984: C version, PC-MATLAB, Simulink
1985: Control System Toolbox
1988: System Identification Toolbox
1993: Symbolic Math Toolbox
2000: MATLAB Desktop user interface
·dl.acm.org·
Model-Based Systems Engineering
Manage system complexity, improve communication, and produce optimized systems with Model-Based System Engineering.
Engineers use model-based systems engineering (MBSE) to manage system complexity, improve communication, and produce optimized systems.
MATLAB, Simulink, System Composer, and Requirements Toolbox together form a single environment for creating descriptive architecture models that connect seamlessly with detailed implementation models.
Systems engineers can establish a digital thread to navigate between system requirements, architecture models, implementation models, and embedded software.
·mathworks.com·
Estimating the moment of inertia of a body using parameter estimation with MATLAB
Using the bifilar pendulum as an example, this article shows how you can improve mass moments of inertia estimates by solving a more accurate nonlinear model.
For simple bodies, the mass moment of inertia can be obtained from a CAD model or derived analytically.
For more complex bodies, it must be measured.
The bifilar (two supporting wires) vertical-axis torsional pendulum is a simple and cost-effective device for this purpose
We start by creating a Simulink model of the bifilar pendulum that implements the equation
We build the model by designing a library-block subsystem for the bifilar pendulum, with mask parameters defining the pendulum's dimensions and the estimated error standard deviations
we can use Simulink Design Optimization to automatically vary the test article's parameters so that the simulation data match the recorded experimental data.
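Outside Simulink, the same simulate-compare-adjust loop can be sketched in Python (a hypothetical example, not the article's model: a free-decay torsional pendulum in which only the ratios b/I and k/I are identifiable; with the restoring coefficient k known from the bifilar geometry, the inertia I follows from the fitted k/I).

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

# Damped torsional pendulum: theta'' = -(b/I) theta' - (k/I) theta.
def simulate(params, t):
    b_over_I, k_over_I = params
    rhs = lambda t, y: [y[1], -b_over_I * y[1] - k_over_I * y[0]]
    return solve_ivp(rhs, (t[0], t[-1]), [0.3, 0.0], t_eval=t).y[0]

t = np.linspace(0, 20, 400)
rng = np.random.default_rng(0)
measured = simulate((0.17, 4.2), t) + rng.normal(0, 0.005, t.size)  # synthetic "experiment"

# Vary the parameters until the simulated response matches the recorded data.
fit = least_squares(lambda p: simulate(p, t) - measured, x0=(0.1, 3.0))
print("estimated b/I, k/I:", fit.x)   # should be close to (0.17, 4.2)
```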
·mathworks.com·
Example of an experimental modeling course
An engineering course, Simulation and Experimental Modeling, has been developed that is based on a method for direct estimation of physical parameters in dynamic systems. Compared with classical system identification, the method appears to be easier to understand, apply, and combine with physical insight. It is based on a sensitivity approach that is useful for choice of model structure, for experiment design, and for accuracy verification. The method is implemented in the Matlab toolkit Senstools. The method and the presentation have been developed with generally preferred learning styles in mind. In a comprehensive evaluation of the course, student responses to a course questionnaire and to an Index Of Learning Styles Questionnaire are analyzed and correlated.
the problem of building mathematical models of dynamic systems from observed data is generally considered difficult by engineering students
A simple, fundamental approach is illustrated graphically
Continuous-time models with physically meaningful parameters are used
Any parameter can be estimated, for example parameters describing nonlinearities and time delays
Stochastic aspects are kept to a minimum and replaced by a sensitivity approach
the merits of the method lie in the pedagogical presentation rather than in the novelty of the theory
the input signal should be chosen so that most of its power lies in the frequency range where model accuracy matters most.
For identifying linear systems, a square-wave input is often a good choice.
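Standard Fourier analysis (my addition) explains why: a square wave of amplitude A and fundamental frequency ω₀ concentrates its power in the low odd harmonics,

$$
x(t)=\frac{4A}{\pi}\sum_{n=1,3,5,\dots}\frac{1}{n}\,\sin(n\omega_0 t)
$$

so choosing ω₀ near the band where model accuracy matters most places the bulk of the excitation power exactly there.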
·ieeexplore.ieee.org·
The story of how the Kalman filter came to the Apollo program (NASA)
"Mi papá invitó a Rudy Kalman a dar una conferencia en Ames, y cuando lo hizo, papá tuvo una epifanía", explicó el joven Schmidt. Kalman había escrito un artículo sobre una solución teórica 'lineal' para estimar la ubicación y la velocidad de un vehículo.
"Mi padre luego desarrolló las ecuaciones para resolver este problema no lineal, una extensión importante del trabajo de Kalman"
las contribuciones de Stanley Schmidt convirtieron una teoría en algo esencial para el éxito de Apolo.
·nasa.gov·
The Kalman filter: past and present
This article is in honor of the 80th birthday of Rudolf Emil Kalman. A brief biography of R.E. Kalman is presented. The most important facts concerned with the creation of the celebrated Kalman filter are briefly outlined. Some trends in the development of applied methods for solution of filtering problems are analyzed. Kalman’s relations and contacts with Russian scientists as well as their contribution to filtering theory and its applications are discussed.
·link.springer.com·
Discovery of the Kalman filter as a practical tool for the aerospace industry (NASA)
The sequence of events that led researchers at the Ames Research Center to the early discovery of the Kalman filter, shortly after its introduction in the literature, is recounted.
The scientific advances and reformulations needed to transform Kalman's work into a useful tool for a specific aerospace application are described.
·ntrs.nasa.gov·
Applications of Kalman filtering in aerospace since 1960
In the 1960s, the Kalman filter was applied to navigation for the Apollo Project, which required estimates of the trajectories of manned spacecraft going to the Moon and back. With the lives of the astronauts at stake, it was essential that the Kalman filter be proven effective and reliable before it could be used. This article is about the lead up to Kalman's work, key discoveries in the development and maturation of the filter, a sampling of its many applications in aerospace, and recognition of some who played key roles in that history.
·ieeexplore.ieee.org·