Found 45 bookmarks
Why are so many published sensitivity analyses false?
Sensitivity analysis provides information on the relative importance of model input parameters and assumptions. It is distinct from uncertainty analys…
Many highly cited papers (42% in the present analysis) present an SA of poor quality.
It is therefore essential to understand the impact of these uncertainties on the model output, if the model is to be used effectively and responsibly in any decision-making process.
sensitivity analysis is “the study of how the uncertainty in the output of a model (numerical or otherwise) can be apportioned to different sources of uncertainty in the model input”
uncertainty analysis (UA), which, as we define it here, characterizes the uncertainty in model prediction, without identifying which assumptions are primarily responsible
type and structure of model, parameters, resolution, calibration data and so forth
Monte Carlo simulation
“design sensitivity analysis” is used as a tool for structural optimisation
While most practitioners of SA distinguish it from UA, modellers overall tend to conflate the two terms, e.g. performing an uncertainty analysis and calling it a sensitivity analysis.
The sensitivity analysis methodology often relies on so-called local techniques which are invalid for nonlinear models.
The greatest density of papers is found in decision science
an even smaller fraction of papers that feature sensitivity analysis adopts a global SA approach.
at least one-third of highly cited papers, matching our search criteria, use deficient OAT methods.
sensitivity analysis is intrinsically attached to modelling, which itself is not a unified subject.
most scientists conflate the meaning of SA and UA.
global sensitivity analysis unavoidably requires a good background in statistics to implement and to interpret results.
the majority of practitioners remain scattered in isolated pockets, and sensitivity analysis is hence not part of a recognized syllabus.
Both uncertainty and sensitivity analysis should be based on a global exploration of the space of input factors, be it using an experimental design, Monte Carlo or other ad-hoc designs.
perform both uncertainty and sensitivity analysis
to focus the sensitivity analysis on the question addressed by the model rather than more generally on the model
If the model is wrong or if it is a poor representation of reality, determining the sensitivity of an individual parameter in the model is a meaningless pursuit
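The pitfall flagged above (local, one-at-a-time techniques being invalid for nonlinear models) can be sketched in a few lines. The toy model is a deliberate assumption whose behavior is pure interaction, so OAT perturbation around a nominal point sees nothing:

```python
import random

def f(x1, x2):
    # Hypothetical toy model: the output is driven entirely by an interaction.
    return x1 * x2

# One-at-a-time (OAT) perturbation around the nominal point (0, 0):
base = f(0.0, 0.0)
oat_effect_x1 = f(1.0, 0.0) - base   # perturb x1 alone -> 0.0
oat_effect_x2 = f(0.0, 1.0) - base   # perturb x2 alone -> 0.0

# Global Monte Carlo exploration of the full input space [-1, 1]^2:
random.seed(0)
ys = [f(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(20_000)]
mean = sum(ys) / len(ys)
variance = sum((y - mean) ** 2 for y in ys) / len(ys)

# OAT declares both inputs irrelevant; the global view shows the output
# is in fact uncertain (theoretical variance is 1/9, about 0.11).
print(oat_effect_x1, oat_effect_x2, round(variance, 3))
```

OAT reports zero sensitivity for both factors, while a global exploration of the input space reveals substantial output uncertainty, which is exactly the failure mode the paper describes.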
·sciencedirect.com·
The future of sensitivity analysis: An essential discipline for systems modeling and policy support
Sensitivity analysis (SA) is en route to becoming an integral part of mathematical modeling. The tremendous potential benefits of SA are, however, yet…
Sensitivity analysis (SA) is en route to becoming an integral part of mathematical modeling.
Sensitivity analysis (SA), in the most general sense, is the study of how the ‘outputs’ of a ‘system’ are related to, and are influenced by, its ‘inputs’.
‘factors’ in SA, may include model parameters, forcing variables, boundary and initial conditions, choices of model structural configurations, assumptions and constraints.
scientific discovery to explore causalities
dimensionality reduction
decision support
It has roots in ‘design of experiments’ (DOE) which is a broad family of statistical methods
We believe that SA is en route to becoming a mature and independent, but interdisciplinary and enabling, field of science.
opinions on the possible and desirable future evolutions of SA science and practice
The modern era of SA has focused on a notion that is commonly referred to as ‘Global Sensitivity Analysis (GSA)’
Such measures are said to be ‘derivative-based’ as they either analytically compute derivatives or numerically quantify the change in output when factors of interest (continuous or discrete) are perturbed around a point.
The full variance-based SA framework was laid down by Ilya Sobol’ in 1993
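A minimal sketch of the variance-based idea, using the classic "pick-freeze" estimator of the first-order index S1. The linear test model and its coefficients are assumptions chosen so the exact answer is known analytically (S1 = 2²/(2² + 1²) = 0.8 for independent U(0,1) inputs):

```python
import random

random.seed(1)

def model(x1, x2):
    # Hypothetical linear test model; analytically S1 = 4/5 = 0.8.
    return 2.0 * x1 + x2

N = 200_000
y, y_pf = [], []
for _ in range(N):
    x1, x2 = random.random(), random.random()
    y.append(model(x1, x2))
    # "Pick" x1 (keep it fixed), "freeze out" x2 by resampling it:
    y_pf.append(model(x1, random.random()))

m, m_pf = sum(y) / N, sum(y_pf) / N
var = sum((v - m) ** 2 for v in y) / N
cov = sum(a * b for a, b in zip(y, y_pf)) / N - m * m_pf
S1 = cov / var   # first-order Sobol' index of x1
print(round(S1, 2))   # close to the analytical value 0.8
```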
One persistent issue in SA is that nearly all applications, regardless of the method used, rest on the assumption that inputs are uncorrelated
ignoring correlation effects and multivariate distributional properties of inputs largely biases, or even falsifies, any SA results
Applications of SA are widespread across many fields, including earth system modeling (Wagener and Pianosi, 2019), engineering (Guo et al., 2016), biomechanics (Becker et al., 2011), water quality modeling (Koo et al., 2020a and 2020b), hydrology (Shin et al., 2013; Haghnegahdar and Razavi, 2017), water security (Puy et al., 2020c), nuclear safety (Saltelli and Tarantola, 2002; Iooss and Marrel, 2019) and epidemiology
it is not a formally recognized discipline
its application in some fields might appear under other titles.
why do I need to run SA for a given problem and what is the underlying question that SA is expected to answer?
how should I design the SA experiment to address that underlying question?
Teach SA more broadly and consistently
a dominant application of SA is for parameter screening, to support model calibration by identifying and fixing non-influential parameters.
Management of uncertainty through its characterization and attribution should be at the heart of the scientific method and, a fortiori, in the use of science for policy
while models are becoming more and more complex, they are treated more and more like a black-box, even by model developers themselves.
SA has significant potential to help in diagnosing the behavior of a mathematical model and for assessing how plausibly the model mimics the system under study for the given application.
To diagnostically test a model, one may compare SA results with expert knowledge on how the underlying system being modeled works.
Most models are poorly-identifiable, largely because of over-parameterization relative to the data and information available
SA and identifiability analysis (IA) are different but complementary
an insensitive parameter is non-identifiable, but the converse is not necessarily true, that is, a sensitive parameter may or may not be identifiable.
Model reduction, however, should be done with caution, as a parameter that seems non-influential under a particular condition might become quite influential under a new condition
fixing parameters that have small sensitivity indices may result in model variations that cannot be explained in the lower dimensional space
Development of research-specific software is at the core of modern modeling efforts.
Computational burden has been a major hindrance to the application of modern SA methods to real-world problems.
The application of SA with machine learning is further complicated because of the fundamental differences between machine learning and other types of models
calls for mutual trust between model developers and end users
The future, therefore, needs new generations of algorithms to keep pace with the ever-increasing complexity and dimensionality of the state-of-the-art models.
A complete assessment of the computational performance of any SA algorithm must be conducted across four aspects: efficiency, convergence, reliability and robustness.
an SA algorithm is robust to sampling variability if its performance remains almost ‘identical’ when applied on two different sample sets taken from the same model.
bootstrapping (Efron, 1987) is often used with SA algorithms to estimate robustness in the form of uncertainty distributions on sensitivity indices without requiring additional model evaluations.
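The resampling idea can be sketched as follows; the data-generating model and the squared-correlation sensitivity measure below are illustrative assumptions, not the indices used in the cited works. The key point survives: the bootstrap reuses the same model evaluations, so no new model runs are needed:

```python
import random

random.seed(2)

# Pretend these are model evaluations we already paid for:
xs = [random.random() for _ in range(2_000)]
ys = [3.0 * x + random.gauss(0.0, 0.5) for x in xs]  # hypothetical model + noise

def r_squared(xs, ys):
    # Squared Pearson correlation: a crude linear sensitivity measure.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

point = r_squared(xs, ys)

# Bootstrap: resample the SAME evaluations with replacement.
estimates = []
for _ in range(500):
    idx = [random.randrange(len(xs)) for _ in range(len(xs))]
    estimates.append(r_squared([xs[i] for i in idx], [ys[i] for i in idx]))
estimates.sort()
lo, hi = estimates[12], estimates[487]   # approximate 95% percentile interval
print(round(point, 2), round(lo, 2), round(hi, 2))
```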
future of SA may step more towards ‘sampling-free’ algorithms that can work on any ‘given data’
More recently, authors have proposed parameter estimation procedures based on nearest neighbors (Broto, 2020), rank statistics (Gamboa et al., 2020) and robustness-based optimization (Sheikholeslami and Razavi, 2020)
Higher dimensionality exacerbates the difficulty of assigning multivariate distributions to uncertain inputs
cases require excessively large sample sizes
sampling strategies
regarding uncertainty estimates, it is notable that only a minority of works apply this quantification systematically
Sensitivity analysis of sensitivity analysis
informal (and often local) SA has contributed and will continue to contribute to a variety of decision-making problems.
Understand whether the current state of knowledge on input uncertainty is sufficient to enable a decision to be taken
Computational burden is recognized as a major hindrance to the application of SA to cases where SA can be most useful, such as for high-dimensional problems
·sciencedirect.com·
Models: a state of exception
Models live in a state of exception. Their versatility, the variety of their methods, the impossibility of their falsification and their epistemic authority allow mathematical models to escape, bet...
Their versatility, the variety of their methods, the impossibility of their falsification and their epistemic authority allow mathematical models to escape, better than other cases of quantification, the lenses of sociology and other humanistic disciplines
Models are thus both underexplored and overinterpreted
more actors should engage in practices such as assumption hunting, modelling of the modelling process, and sensitivity analysis and auditing
The state of exception also results from the pretence of neutrality customarily associated with mathematics, and from the asymmetry between developers and users.
‘Funny Numbers’ produced by econometricians to assess the risk of financial instruments
reciprocal domestication between models and society
models are special compared to other families of quantification
Modellers are regarded as endowed with privileged access to the foundations of reality.
Models do not meet classic (Popperian) criteria of scientificity.
a major problem in our use of mathematical models lies in assimilating them to physical laws, and hence treating their predictions with the same dignity
models act as integrators of a broad array of ingredients, including theoretical notions, mathematical concepts and techniques, stylized facts, empirical data, policy views, analogies and metaphors.
models are like metaphors that help us understand how we see the world
consequences of being special
asymmetry between developers and users
models universally known to be wrong continue to play a role in economic policy decisions
Models lend themselves to trans-science. Trans-science refers to scientific practices that appear to be formulated in the language of science, but that science cannot solve because of their sheer complexity or insufficient knowledge
Vulnerability to modelling hubris
models that are too complex (burdened by an excessive number of estimated parameters) may lead to greater imprecision due to error propagation.
Thinking about the reproducibility of models
malpractices such as HARK [Hypothesis After the Results are Known (Kerr 1998)]
In practice, it is safest to treat each run of a model as an experiment.
Sensitivity analysis and sensitivity auditing
Reforming modelling will involve both the adoption of methods in the practice of research and their incorporation into teaching syllabuses.
Uncertainty quantification should be at the heart of the scientific method
there is an evident need for greater clarity on how risk numbers are computed.
The notion that a model can be an avenue of possibly instrumental ‘displacement’, in the sense of moving the attention from the system to its model, as discussed in Rayner (2012), is still too radical for many practitioners to contemplate.
preregistration of modelling studies (Ioannidis 2022) is still a long way off
·tandfonline.com·
Five ways to ensure that models serve society: a manifesto
Pandemic politics highlight how predictions need to be transparent and humble to invite insight, not blame.
predictions need to be transparent and humble to invite insight, not blame.
The COVID-19 pandemic illustrates perfectly how the operation of science changes when questions of urgency, stakes, values and uncertainty collide — in the ‘post-normal’ regime.
decision makers and citizens need to establish new social norms
Here we present a manifesto for best practices for responsible mathematical modelling
Mind the assumptions Assess uncertainty and sensitivity.
One way to mitigate these issues is to perform global uncertainty and sensitivity analyses.
Mind the hubris Complexity can be the enemy of relevance.
there is a trade-off between the usefulness of a model and the breadth it tries to capture
As more parameters are added, the uncertainty builds up (the uncertainty cascade effect)
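The "uncertainty cascade" can be sketched with a hedged toy experiment: assume, for illustration only, that the output is a product of parameters each carrying 20% relative uncertainty. The relative spread of the output then grows with the number of parameters:

```python
import random

random.seed(3)

def output_spread(n_params, n_runs=20_000):
    """Relative spread (coefficient of variation) of a product of
    n_params factors, each drawn from N(1, 0.2) -- an assumed setup."""
    ys = []
    for _ in range(n_runs):
        y = 1.0
        for _ in range(n_params):
            y *= random.gauss(1.0, 0.2)
        ys.append(y)
    m = sum(ys) / n_runs
    var = sum((y - m) ** 2 for y in ys) / n_runs
    return var ** 0.5 / m

cv_small, cv_large = output_spread(2), output_spread(10)
# Adding parameters compounds the uncertainty: the cascade effect.
print(round(cv_small, 2), round(cv_large, 2))
```

With 2 factors the relative spread is about 0.29; with 10 factors it more than doubles, even though each individual uncertainty is modest.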
The complexity of a model is not always an indicator of how well it captures the important features.
the time needed for water to percolate down to the underground repository level was uncertain by three orders of magnitude, rendering the size of the model irrelevant
no one can be held accountable if the predictions are catastrophically wrong
Mind the framing Match purpose and context.
No one model can serve all purposes.
all presuppose a set of values about what matters
how to produce a model, assess its uncertainty and communicate the results.
Whenever a model is used for a new application with fresh stakeholders, it must be validated and verified anew.
Mind the consequences Quantification can backfire.
Undiscriminating use of statistical tests can substitute for sound judgement.
Spurious precision adds to a false sense of certainty.
some might imagine a confidence of two significant digits
trust is essential for numbers to be useful
Mind the unknowns Acknowledge ignorance.
models can hide ignorance
Experts should have the courage to respond that “there is no number-answer to your question”
Questions not answers Mathematical models are a great way to explore questions.
Asking models for certainty or consensus is more a sign of the difficulties in making controversial decisions than it is a solution
good modelling cannot be done by modellers alone. It is a social activity.
Following these five points will help to preserve mathematical modelling as a valuable tool.
·nature.com·
A chronology of sensitivity analysis
The last half a century has seen spectacular progresses in computing and modelling in a variety of fields, applications, and methodologies. Over the s…
evolution from local to global methods
future directions and areas of growth in the field
design of experiments
While uncertainty analysis studies the uncertainty in the output, sensitivity analysis studies how the uncertainty in the output can be allocated to the different sources of uncertainty in the input
Sensitivity analysis serves various purposes, including model validation, dimensionality reduction, prioritization of research efforts, pinpointing critical regions within the space of uncertainties under investigation, and aiding decision-making by quantifying how input variations impact outcome uncertainty
independent discipline recognised also by institutional guidelines (European Commission
This “global” understanding started in the 1970s
the sensitivity analysis panorama is still dominated in practically all disciplines by the so-called "local" approaches
tension still characterizing different schools of sensitivity analysis
a good date to set the start of sensitivity analysis is 1905, when Karl Pearson, the founder of modern statistics, proposed the idea of the correlation ratio (known as the η² index)
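Pearson's correlation ratio can be sketched as a binned first-order sensitivity measure: slice the input into bins and divide the variance of the per-bin output means by the total output variance. The toy model below is an assumption for illustration:

```python
import random

random.seed(4)

# Hypothetical model: Y depends nonlinearly on X, plus independent noise.
xs = [random.random() for _ in range(50_000)]
ys = [x * x + random.gauss(0.0, 0.1) for x in xs]

# eta^2: variance of the conditional means E[Y | X in bin] over total variance.
n_bins = 50
bins = [[] for _ in range(n_bins)]
for x, y in zip(xs, ys):
    bins[min(int(x * n_bins), n_bins - 1)].append(y)

m = sum(ys) / len(ys)
total_var = sum((y - m) ** 2 for y in ys) / len(ys)
between = sum(len(b) * (sum(b) / len(b) - m) ** 2 for b in bins if b) / len(ys)
eta2 = between / total_var
# Analytically Var(X^2) = 4/45 and the noise adds 0.01, so eta^2 ~ 0.9.
print(round(eta2, 2))
```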
formalization of the experimental design in the 1920s and 1930s by the statistician Ronald Fisher.
process whereby statistics managed to adjudicate the authority to assess the ‘realism of causes’
World War 2 provided a significant impetus for the expansion and application of sensitivity analysis within the field of operational research
Experimental design continued to develop with several important advancements in this field in the 1950s, including the widespread adoption of factorial designs
Fourier amplitude sensitivity test (FAST) in the early 1970s
At the early stages of the 80's, a notable contribution was made by Efron and Stein (1981) on the variance decomposition into terms of increasing dimensionality
One of the key breakthroughs during this time was the adoption of random sampling techniques
In 1993, Sobol’ introduced an innovative approach to sensitivity analysis based on the decomposition of the output variance
Towards the end of the 1990s, a brand-new community of sensitivity-analysis practitioners emerged, reflecting on the concept of “global sensitivity analysis”
introduced the innovative winding stairs method for computing higher-order effects
emphasizing its application in various settings like factor prioritization, factor fixing, and variance reduction, stressing the need for global methods in order to treat non-linear and non-additive models
A first international conference on global sensitivity analysis (SAMO) was organized in 1995
local sensitivity analysis methods remained prevalent across disciplines
SA practitioners also started to compare the performance of sensitivity analysis methods using SA itself
researchers focused on developing methods for sensitivity analysis of computationally expensive models
The trend toward model complexification emphasized the importance of using sensitivity analysis to ensure accurate and reliable model outputs.
The COVID-19 pandemic was partly instrumental in this development, leading several authors to question the political use of models
Sensitivity analysis and auditing have recently been proposed as tools to jointly match the double demand for technical and normative quality in modelling
the penetration of global sensitivity analysis methods into the broader modelling community has not reached its full potential
Notably absent fields include finance and economics, and, to a lesser extent, medicine and related fields such as psychology and neuroscience
hypothesis-testing methods (e.g., Dunnett's test), which are capable of answering similar questions.
If sensitivity analysis has not been widely adopted or promoted within a particular discipline, researchers might be less inclined to explore its potential benefits
there is a growing demand for user-friendly tools
In the future, sensitivity analysis is expected to play a pivotal role in guiding model development and decision-making processes, especially as simulation models become increasingly bigger and more complex
·sciencedirect.com·
Science in the post-normal age
In my view, post-normal science pays more attention to the influence of people, in the form of uncertainty in a special kind of input: perturbations (influence, ideology, ethics, politics, interests).
Post-Normal Science (PNS)
facts uncertain, values in dispute, stakes high and decisions urgent
it really applies to politics, and not to science
correctly defining the characteristics of both the world, and the relevant science
Post-Normal Science
https://en.wikipedia.org/wiki/Post-normal_science
there are essential social and political dimensions of the problem-solving practice
We use the two attributes of systems uncertainties and decision stakes to distinguish among these.
traditional methodologies are ineffective
The reductionist, analytical worldview which divides systems into ever smaller elements, studied by ever more esoteric specialism, is being replaced by a systemic, synthetic and humanistic approach
In this ‘normal’ state of science, uncertainties are managed automatically, values are unspoken, and foundational problems unheard of.
uncertainty is not banished but is managed, and values are not presupposed but are made explicit
quality of scientific information
problem-solving strategies, analysed in terms of uncertainties in knowledge and complexities in ethics.
uncertainty cannot be banished from science; but that good quality of information depends on good management of its uncertainties
applied science is ‘mission-oriented’; professional consultancy is ‘client-serving’; and post-normal science is ‘issue-driven’
issue-driven
their resolution requires new conceptions of scientific methodology
different kinds of uncertainty can be expressed, and used for an evaluation of quality of scientific information
three zones representing and characterizing three kinds of problem-solving strategies
the research exercise is generally not undertaken unless there is confidence that the uncertainties are low, that is that the problem is likely to be soluble by a normal, puzzle-solving approach
professional consultancy includes applied science
uncertainty is at the methodological level
will be in conflict, involving various human stakeholders and natural systems as well
Of engineering we could say that most routine engineering practice is a matter of empirical craft skills using the results of applied science, while at its highest levels it becomes true professional consultancy.
The outcomes of applied science exercises, like those of core science, have the features of reproducibility and prediction.
professional tasks deal with unique situations, however broadly similar they may be.
it would be unrealistic to expect two safety engineers to produce the same model (or the same conclusions) for a hazard analysis of a complex installation.
these policy issues involve professional consultancy, such disagreements should be seen as inevitable and healthy.
We can envisage four components in the problem-solving task; the process, the product, the person and the purpose
[Diagram: the four components of the problem-solving task: Person, Process, Product, Purpose]
In professional consultancy there can be no simple, objective criteria or processes for quality assurance (beyond simple competence)
post-normal science occurs when uncertainties are either of the epistemological or the ethical kind, or when decision stakes reflect conflicting purposes among stakeholders
post-normal science is indeed a type of science, and not merely politics or public participation
Because of the high level of uncertainty, approaching sheer ignorance in some cases
The uncertainties go beyond those of the systems, to include ethics as well.
In post-normal science, the manifold uncertainties in both products and processes require that the relative importance of persons becomes enhanced.
It is important to appreciate that post-normal science is complementary to applied science and professional consultancy. It is not a replacement for traditional forms of science
·commonplace.knowledgefutures.org·
Models with higher effective dimensions tend to produce more uncertain estimates
Without validation data, more complex models may succumb to uncertainty.
modelers can better ponder whether the addition of detail truly matches the model’s purpose
transmission model of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), based on more than 900 parameters
the development of finer-grained models often proceeds without having, at the scale required, specific data available to train or validate the model
modelers cannot benefit from existing statistical instruments that help balance model complexity with error so as to align with science’s quest for parsimony, such as Akaike’s (13) or Schwarz’s (14) information criterion
modelers can gauge the connection between model complexity and uncertainty at all stages of model development by calculating the model’s “effective dimensions,” that is, the number of influential parameters and active higher-order effects
effective dimensions
the addition of model detail in process-based models tends to produce more (and not less) uncertain estimates because it increases the model’s effective dimensions, which generally boost the output variance
statistical theory behind the connection between model uncertainty, complexity, and the notion of effective dimensions
how the concept of effective dimensions can help modelers balance model complexity with uncertainty
number of parameters and the pattern of their connections as key contributors to the complexity of mathematical models
aggregate complexity
The effective dimension ks is therefore the order of the highest effect that needs to be included in Eq. 1 to reach p
more complex models will generally display a higher effective dimension in kt and ks, an increase that promotes a larger output uncertainty
a model with more parameters does not necessarily lead to a larger uncertainty if n ≠ k
classic susceptible-infected-recovered (SIR) model is often enhanced with the addition of fine-grained features such as seasonality or age stratification
If the goal of the model is to gain insights into the effects that vaccination and nonpharmaceutical interventions may have on the spread of the virus, then the SIR(S-V) might be preferred over the more complex SIR(S-E) because the extra detail in the latter does not help clarify potential courses of action
importance of stripping mathematical models of superfluous parameters, processes, or linkages
Ti includes all terms in Eq. 3 with the index i
·science.org·
Stiffness of ordinary differential equations
Stiffness is a subtle concept that plays an important role in assessing the effectiveness of numerical methods for ordinary differential equations. (This article is adapted from section 7.9,
A problem is stiff if the solution being sought varies slowly, but there are nearby solutions that vary rapidly, so the numerical method must take small steps to obtain satisfactory results.
Nonstiff methods can solve stiff problems; they just take a long time to do it.
For truly stiff problems, a stiff solver can be orders of magnitude more efficient, while still achieving a given accuracy
You don't want to change the differential equation or the initial conditions, so you have to change the numerical method.
·blogs.mathworks.com·
Why do we need stiff ODE solvers?
I hope you have noticed the new MATLAB Central blog: Cleve's Corner. Cleve once said
the output is flat most of the time, but shows a very fast sudden transition. This is typical of stiff systems.
compare the results between ode45 and ode23s
ode45 is not able to recover after the fast transition and keeps taking very small steps. In this specific example, ode45 required 3046 steps to solve the problem, while ode23s required only 91 steps! The ode23s solver may do more computations per step, but it can take far fewer steps than ode45.
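The effect can be sketched without MATLAB using the simplest explicit and implicit methods. The test equation y' = -1000(y - cos t) is an assumed stand-in for the blog's example: its solution hugs cos(t) almost immediately, but the fast decay mode destabilizes explicit steps that a stiff (implicit) method handles easily:

```python
import math

# Hypothetical stiff test problem: y' = -1000*(y - cos(t)), y(0) = 0.
LAM, T, H = 1000.0, 2.0, 0.01
n = int(T / H)

# Explicit (forward) Euler: unstable here, since |1 - LAM*H| = 9 > 1,
# so the error is multiplied by ~9 at every step.
y_exp = 0.0
for i in range(n):
    t = i * H
    y_exp += H * (-LAM * (y_exp - math.cos(t)))

# Implicit (backward) Euler: solve the update equation for y_{n+1}:
# y_{n+1} = (y_n + LAM*H*cos(t_{n+1})) / (1 + LAM*H).
y_imp = 0.0
for i in range(n):
    t_next = (i + 1) * H
    y_imp = (y_imp + LAM * H * math.cos(t_next)) / (1.0 + LAM * H)

print(abs(y_exp), abs(y_imp - math.cos(T)))
# explicit Euler has blown up; implicit Euler tracks cos(t) closely
```

With the same step size, the explicit scheme diverges while the implicit one stays accurate, mirroring the ode45-versus-ode23s step counts in the post.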
·blogs.mathworks.com·
A Gömböc
it has a single stable equilibrium point and a single unstable one
·es.wikipedia.org·
When all models are wrong
More stringent quality criteria are needed for models used at the science/policy interface. Here is a checklist to aid in the responsible development and use of models.
tal escepticismo parece cada vez más común y cada vez más independiente de la posición ideológica
¿el tono de este tipo de ataques refleja un colapso de la confianza en la empresa científica y en su papel social e institucional?
la sociedad puede estar menos dispuesta a aceptar tales afirmaciones que en el pasado
los peligros para la ciencia se hacen más evidentes cuando los modelos —resúmenes de problemas más complejos del mundo real, generalmente expresados en términos matemáticos— se utilizan como herramientas de política.
serie particularmente accesible de historias de terror sobre el mal uso de los modelos y el consiguiente fracaso de las políticas
la precisión y el valor de las predicciones en sí mismas terminan estando en el centro de los debates políticos y distraen de la necesidad y la capacidad de abordar el problema a pesar de las incertidumbres actuales.
"todos los modelos son erróneos, pero algunos son útiles"
se deben adoptar criterios estrictos de transparencia cuando los modelos se utilizan como base para las evaluaciones de políticas
cuanto más se entiende el clima, más inciertas se vuelven las predicciones de los modelos de futuros climáticos específicos
Muchas pequeñas incertidumbres multiplicadas juntas producen enormes incertidumbres agregadas.
El desafío se vuelve aún más desalentador cuando los modeladores centran su atención en las consecuencias económicas de los cambios en la composición atmosférica.
Semejante esfuerzo está tan alejado de la capacidad predictiva actual que raya en lo irresponsable.
la mayoría de los aspectos de la mayoría de los modelos no estarán sujetos a un escrutinio tan minucioso
se basan en lo que los modelos pronostican sobre el futuro, con poca o ninguna sensibilidad a los límites de lo que los modelos son realmente capaces de pronosticar con precisión.
proponemos siete reglas que, en su conjunto, forman una lista de verificación para el desarrollo y uso responsable de los modelos.
análisis de sensibilidad global
El análisis de sensibilidad global evalúa la importancia relativa de los factores de entrada en términos del impacto de su incertidumbre en la salida del modelo.
¿Cuál de esas incertidumbres tiene el mayor impacto en el resultado?
"auditoría de sensibilidad"
La "auditoría" hace hincapié en la idea de la rendición de cuentas a un público más amplio (en este caso, los responsables de la formulación de políticas y el público) y, por lo tanto, exige que el modelo sea accesible y transparente y que la experiencia no se defina de manera estrecha para excluir a todos excepto a quienes crearon el modelo
La auditoría de sensibilidad no tiene como objetivo mejorar el modelo; Más bien, al igual que una auditoría fiscal, se produce al final del proceso, en el momento en que el modelo se convierte en una herramienta para la evaluación de políticas, cuando todos los desarrolladores han llevado a cabo toda la calibración posible del modelo, la optimización, la asimilación de datos y similares utilizando las herramientas de su oficio.
Rule 1: Use models to clarify, not to obscure.
Rule 1 prescribes asking who benefits from the model and what motivations and incentives drive the model developers.
Rule 2: Adopt an "assumption hunting" attitude
New processes are needed for developing and using models, in which engaged stakeholders work with disciplinary experts to build new models that can be used to test various policy options
demands that models used to guide regulatory decisions be made available to a third party, so that the impact of changing input assumptions or parameters on the model-based conclusion can be assessed.
Rule 3: Detect pseudoscience.
we define pseudoscience as the practice of ignoring or hiding uncertainties in model inputs to ensure that model outputs can be tied to preferred policy choices.
A common indicator of this kind of pseudoscience is spurious precision, e.g. when a result is reported with a number of digits that exceeds (sometimes ridiculously) any plausible estimate of the associated accuracy.
C. F. Gauss: "lack of mathematical culture is revealed nowhere so conspicuously as in meaningless precision in numerical computations"
Do not publish your inferences without having carried out a careful sensitivity audit.
Rule 4: Find sensitive assumptions before they find you.
"Thou shalt confess in the presence of sensitivity. Corollary: thou shalt anticipate criticism"
Any model-based inference that enters the policy arena without an accompanying, technically sound sensitivity analysis should be regarded as suspect.
Rule 5: Aim for transparency. Stakeholders should be able to make sense of, and if possible replicate, the results of the analysis.
simple or parsimonious model representations are better than more "sophisticated" or complex models when used for policy impact assessments.
Rule 6: Do not just "do the sums right," but "do the right sums." When relevant stakeholder viewpoints are neglected, modellers may focus on or address the wrong uncertainties.
a type one error is a false positive: a practice is found to be unsafe when it is safe, or an intervention not beneficial when it is beneficial. A type two error is the opposite: a false negative. A type three error, by contrast, is one in which the analysis itself is framed incorrectly and the problem is therefore mischaracterized.
it is easy to fall into what we may call "lamp-posting," in which the uncertainties or parameters scrutinized most closely are the least relevant but the easiest to analyze
Rule 7: Focus the analysis. Do not perform perfunctory sensitivity analyses, simply changing one factor at a time.
For example, in a system with 10 uncertain factors, moving only one at a time risks exploring only a small fraction of the total potential input uncertainty.
the choice of a baseline is itself an assumption-laden process and is therefore subject to criticism and to sensitivity analysis.
A credible sensitivity audit should not be anchored to baselines that are themselves subjective; it should assess the effect of any input while all the other inputs are also allowed to vary.
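A back-of-the-envelope calculation makes the 10-factor example concrete: every point a one-at-a-time (OAT) design can reach lies inside the hypersphere inscribed in the unit hypercube of inputs, and the ratio of sphere volume to cube volume collapses as the number of factors grows. A minimal sketch (the function name is ours):

```python
import math

def inscribed_sphere_fraction(k: int) -> float:
    """Fraction of a unit hypercube's volume taken up by its inscribed
    hypersphere (radius 0.5). OAT points around a central baseline stay
    inside this sphere, so this fraction bounds the share of the input
    space an OAT design can explore."""
    r = 0.5
    return math.pi ** (k / 2) / math.gamma(k / 2 + 1) * r ** k

for k in (2, 5, 10):
    print(f"k={k:2d}: {inscribed_sphere_fraction(k):.4%}")
```

For 10 factors the inscribed sphere occupies roughly a quarter of one percent of the input hypercube, which is the usual argument for global designs that vary all inputs simultaneously.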
Our view is that current modelling practices, in both development and use, are a significant threat to the legitimacy and usefulness of science in contested policy settings.
·issues.org·
When all models are wrong
Uses and abuses of mathematics in biology
Uses and abuses of mathematics in biology
  • Mathematics has been less intrusive in the life sciences, possibly because until recently they have been largely descriptive, lacking the invariance principles and fundamental natural constants of physics.
  • Models of evolution, ecology, and epidemiology.
  • Applications of mathematics to the sequencing of the human and other genomes.
  • The virtue of mathematics is that it imposes clarity and precision on conjecture, allowing meaningful comparison between the consequences of basic assumptions and the empirical facts.
  • Mathematical models have proved to have many uses and to take many forms in the life sciences.
  • However, mathematics does not have the enduring relationship with the life sciences that it has with the physical sciences and engineering.
  • The most common abuses, and not always the easiest to recognize, are situations in which mathematical models are built with an unbearable abundance of detail in some respects, while other important facets of the problem remain fuzzy or a vital parameter is uncertain to within an order of magnitude.
·researchgate.net·
Uses and abuses of mathematics in biology
Simulating the spread of a disease under different scenarios
Simulating the spread of a disease under different scenarios
The initial trickle of new coronavirus infections has turned into a steady stream. We can learn how to slow it through simple simulations.
moderate social distancing usually works better than attempted quarantine
extensive social distancing usually works better than anything else
·washingtonpost.com·
Simulating the spread of a disease under different scenarios
Electromechanical analogy
Electromechanical analogy
A procedure for modelling electrical systems by combining mechanical elements, and vice versa
Electromechanical analogies are used to model the behaviour of a mechanical system with an equivalent electrical system, by establishing analogies between mechanical and electrical parameters.
An electromechanical analogy consists of representing a mechanical system by an electrical circuit.
This approach is especially useful in the design of mechanical filters, since simple electrical devices are used to emulate much more expensive and complex mechanical systems.
·es.wikipedia.org·
Electromechanical analogy
A brief biography of Paul A. M. Dirac and the historical development of the Dirac delta function
A brief biography of Paul A. M. Dirac and the historical development of the Dirac delta function
  • Mathematically, it is not an ordinary function.
  • It can informally be interpreted as the derivative of the Heaviside unit step function.
  • It can be defined as a generalized function or a distribution.
  • Dirac's 1920s original mathematical intuition and his systematic use of the Dirac delta function in quantum mechanics became an effective and useful mathematical tool in mathematical physics and applied mathematics, and led him to state strongly that it must be true due to its clarity and elegance.
  • In order to provide a proper mathematical justification of the Dirac delta function, S. L. Sobolev (1908–1989), a famous Russian mathematician, first introduced the idea of generalized functions in 1936. About 14 years later, it became a whole new discipline: the theory of distributions.
  • Examples of delta sequences: (a) the rectangular (or Dirac) δ-sequence, (b) the Gaussian δ-sequence.
·tandfonline.com·
A brief biography of Paul A. M. Dirac and the historical development of the Dirac delta function
Models in science
Models in science
Models are of central importance in many scientific contexts.
Scientists spend significant amounts of time building, testing, comparing, and revising models
Phenomenological models have been defined in different, although related, ways. A common definition takes them to be models that only represent observable properties of their targets and refrain from postulating hidden mechanisms and the like
Exploratory models are models which are not proposed in the first place to learn something about a specific target system or a particular experimentally established phenomenon.
A model of data (sometimes also “data model”) is a corrected, rectified, regimented, and in many instances idealized version of the data we gain from immediate observation
Some models are physical objects. Such models are commonly referred to as “material models”
If a model is constructed in a domain where no theory is available, then the model is sometimes referred to as a “substitute model”
Models as a means to explore theory
Models as complements of theories
Models as preliminary theories
Models as mediators
·plato.stanford.edu·
Models in science
Computer simulations in science
Computer simulations in science
The list of sciences that make extensive use of computer simulation has grown to include astrophysics, particle physics, materials science, engineering, fluid mechanics, climate science, evolutionary biology, ecology, economics, decision theory, medicine, sociology, epidemiology, and many others.
In its narrowest sense, a computer simulation is a program that is run on a computer and that uses step-by-step methods to explore the approximate behavior of a mathematical model. Usually this is a model of a real-world system (although the system in question might be an imaginary or hypothetical one)
More broadly, we can think of computer simulation as a comprehensive method for studying systems.
Equation-based Simulations
Agent-based simulations
But some simulation models are hybrids of different kinds of modeling methods. Multiscale simulation models, in particular, couple together modeling elements from different scales of description.
Monte Carlo Simulations
MC simulations are computer algorithms that use randomness to calculate the properties of a mathematical model and where the randomness of the algorithm is not a feature of the target model.
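The distinction is easy to illustrate: the area of the unit disc is a fixed, deterministic quantity, yet a Monte Carlo algorithm estimates it by sampling at random. A minimal sketch (the function name is ours):

```python
import random

def estimate_pi(n: int, seed: int = 0) -> float:
    """Monte Carlo estimate of pi: the randomness belongs to the
    algorithm, not to the target quantity (the area of the unit disc,
    which is perfectly deterministic)."""
    rng = random.Random(seed)
    # Count points of the unit square that fall inside the quarter disc.
    hits = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))
    return 4.0 * hits / n

print(estimate_pi(200_000))
```

The estimate converges at the usual Monte Carlo rate of about 1/√n, independent of the dimension of the problem.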
Purposes of Simulation
What warrants our taking a computer simulation to be a severe test of some hypothesis about the natural world?
Verification can be divided into solution verification and code verification.
The principal strategy of validation involves comparing model output with observable data.
A simulation that accurately mimics a complex phenomenon contains a wealth of information about that phenomenon.
Philosophers, consequently, have begun to consider in what sense, if any, computer simulations are like experiments and in what sense they differ.
computer simulations have profound implications for our understanding of the structure of theories
The identity thesis. Computer simulation studies are literally instances of experiments.
·plato.stanford.edu·
Computer simulations in science
Model-Based Systems Engineering
Model-Based Systems Engineering
Manage system complexity, improve communication, and produce optimized systems with Model-Based System Engineering.
Engineers use model-based systems engineering (MBSE) to manage system complexity, improve communication, and produce optimized systems.
MATLAB, Simulink, System Composer, and Requirements Toolbox together form a single environment for building descriptive architecture models that connect seamlessly with detailed implementation models.
Systems engineers can establish a digital thread to navigate among system requirements, architecture models, implementation models, and embedded software.
·mathworks.com·
Model-Based Systems Engineering
On nonminimum-phase zeros
On nonminimum-phase zeros
The purpose of this article is to illuminate the critical role of system zeros in control-system performance for the benefit of a wide audience both inside and outside the control systems community. Zeros are a fundamental aspect of systems and control theory; however, the causes and effects of zeros are more subtle than those of poles. In particular, positive zeros can cause initial undershoot (initial error growth), zero crossings, and overshoot in the step response of a system, whereas nonminimum-phase zeros limit bandwidth. Both of these aspects have real-world implications in many applications. Nonminimum-phase zeros exacerbate the tradeoff between the robustness and achievable performance of a feedback control system. From a control-theoretic point of view, a nonminimum-phase zero in the loop transfer function L is arguably the worst feature a system can possess. Every feedback synthesis methodology must accept limitations due to the presence of open-right-half-plane zeros, and the mark of a good analysis tool is the ability to capture the performance limitations arising from nonminimum-phase zeros.
critical role of system zeros in control-system performance
the causes and effects of zeros are more subtle than those of poles
Zeros are a fundamental aspect of systems and control theory
positive zeros can cause initial undershoot (initial error growth)
nonminimum-phase zeros limit bandwidth
Nonminimum-phase zeros exacerbate the tradeoff between the robustness and achievable performance of a feedback control system
a nonminimum-phase zero in the loop transfer function L is arguably the worst feature a system can possess.
accept limitations due to the presence of open-right-half-plane zeros
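The undershoot can be reproduced on a toy plant. As an illustrative choice of ours (not taken from the article), G(s) = (1 − s)/(s + 1)² has an open-right-half-plane zero at s = 1 and unit DC gain; a crude Euler simulation of its step response shows the output first moving in the wrong direction:

```python
def step_response_nmp(dt=1e-4, t_end=8.0):
    """Step response of G(s) = (1 - s)/(s + 1)^2, realized in state form
    as z'' + 2 z' + z = u with output y = z - z', so the right-half-plane
    zero shows up as a derivative term of opposite sign."""
    z, dz, ys = 0.0, 0.0, []
    for _ in range(round(t_end / dt)):
        ddz = 1.0 - 2.0 * dz - z      # unit step input u(t) = 1
        dz += dt * ddz
        z += dt * dz
        ys.append(z - dz)
    return ys

ys = step_response_nmp()
print(min(ys))   # negative: initial undershoot caused by the RHP zero
print(ys[-1])    # settles near the DC gain G(0) = 1
```

The closed-form response 1 − e^(−t) − 2t·e^(−t) confirms the picture: the initial slope is −1, the dip bottoms out near −0.21 at t = 0.5, and the output then climbs to 1.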
·ieeexplore.ieee.org·
On nonminimum-phase zeros
Verification and validation of computer simulation models
Verification and validation of computer simulation models
Simulation models are approximate imitations of real-world systems and never imitate the real-world system exactly
model verification is the process of confirming that the model is implemented correctly with respect to the conceptual model (it matches the specifications and assumptions deemed acceptable for the purpose of the application)
Validation checks the accuracy of the model's representation of the real system.
·es.wikipedia.org·
Verification and validation of computer simulation models
The mathematics a control engineer needs
The mathematics a control engineer needs
Reader Aziz Khan asks: "Is there a list of mathematical subjects that you think Ph.D. students in control should have a solid grip on so that they can understand and contribute to the control literature?"
a wide and diverse range of mathematical techniques is employed
structured singular value analysis draws on differential equations, matrix theory, optimization, dynamical systems theory, multivariate statistics, and stochastic processes.
Some of the most promising opportunities, judging by funding levels and the current political climate and rhetoric, are in the areas of biology, energy, health, the environment, and social networks.
·ieeexplore.ieee.org·
The mathematics a control engineer needs
Chaos theory in planetary motion and climate
Chaos theory in planetary motion and climate
The universe seems orderly, but chaos is the hidden force governing the motion of the planets and our lives.
chaos is the hidden force governing the motion of the planets and our lives.
Chaos, in scientific terms, is not synonymous with total disorder, but a condition in which small changes in initial conditions can produce completely different outcomes.
Poincaré showed that when n = 3 or more, no such formula can exist
no matter how complex a formula we construct to describe the motion of the three bodies over a finite period of time, the formula will eventually fail if the motion is extended to longer periods.
the orbits of the three bodies never repeat
in none of the ensemble members is the Earth ejected from the solar system within the next few billion years
The model with chaotic behaviour is one thing; the system it describes is another. We will never be able to predict the long-term state of a chaotic system, but can that behaviour really be unique? Can the flap of a butterfly's wings really change the climate? I think not.
Lorenz's intuition was that the weather equations did not admit such periodic behaviour.
Lorenz kept simplifying and simplifying until he devised a mathematical model of fluid motion based on just three equations for three variables, conventionally labelled X, Y and Z
The Lorenz model is so idealized that it no longer makes much sense to relate these three variables to concrete quantities
·muyinteresante.com·
Chaos theory in planetary motion and climate
Bifurcation
Bifurcation
area of mathematics
A local bifurcation occurs when a parameter change causes the stability of an equilibrium (or fixed point) to change.
Global bifurcations occur when "larger" invariant sets, such as periodic orbits, collide with equilibria.
Bifurcation theory has been applied to connect quantum systems with the dynamics of their classical analogues in atomic systems
a bifurcation occurs when a small smooth change made to the parameter values (the bifurcation parameters) of a system causes a sudden 'qualitative' or topological change in its behavior
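A concrete instance of such a sudden qualitative change is the flip (period-doubling) bifurcation of the logistic map x → r·x·(1 − x) at r = 3: nudging the parameter across that value turns a stable fixed point into a period-2 cycle. A minimal sketch (the function name is ours):

```python
def logistic_attractor(r, x0=0.2, burn=1000, keep=64):
    """Iterate x -> r*x*(1 - x), discard a transient, and return the
    distinct values visited afterwards (rounded to expose periodicity)."""
    x = x0
    for _ in range(burn):
        x = r * x * (1 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return sorted(seen)

print(len(logistic_attractor(2.9)))  # 1 value: a stable fixed point
print(len(logistic_attractor(3.1)))  # 2 values: a period-2 cycle
```

The smooth change r = 2.9 → 3.1 is small, but the topology of the attractor changes discontinuously, which is exactly what the definition above describes.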
The name "bifurcation" was first introduced by Henri Poincaré in 1885 in the first paper in mathematics showing such a behavior
·en.wikipedia.org·
Bifurcation
Limit cycle
Limit cycle
closed trajectory in a 2d phase space of a dynamical system such that another trajectory spirals into it as time approaches ±∞
Finding limit cycles is, in general, a very hard problem.
Stable, unstable and semi-stable limit cycles
a limit cycle is a closed trajectory in phase space having the property that at least one other trajectory spirals into it either as time approaches infinity or as time approaches negative infinity
The study of limit cycles was initiated by Henri Poincaré (1854–1912)
The Bendixson–Dulac theorem and the Poincaré–Bendixson theorem predict the absence or existence, respectively, of limit cycles of two-dimensional nonlinear dynamical systems
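As an illustration, the Van der Pol oscillator x″ − μ(1 − x²)x′ + x = 0 has a stable limit cycle of amplitude ≈ 2 for μ = 1; trajectories started well inside and well outside it settle onto the same closed orbit. A crude forward-Euler sketch (integrator and parameter values are our own choices):

```python
def vdp_amplitude(x0, mu=1.0, dt=0.001, burn=30_000, measure=10_000):
    """Integrate x'' - mu*(1 - x^2)*x' + x = 0 with forward Euler,
    discard the transient, then return the peak |x| over roughly one
    and a half periods of the settled motion."""
    x, v = x0, 0.0
    for _ in range(burn):
        x, v = x + dt * v, v + dt * (mu * (1 - x * x) * v - x)
    amp = 0.0
    for _ in range(measure):
        x, v = x + dt * v, v + dt * (mu * (1 - x * x) * v - x)
        amp = max(amp, abs(x))
    return amp

print(vdp_amplitude(0.1))   # started well inside the cycle
print(vdp_amplitude(4.0))   # started well outside it
```

Both starts converge to the same amplitude, the numerical signature of one attracting closed orbit rather than a continuum of them.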
·en.wikipedia.org·
Limit cycle
Chaos theory
Chaos theory
area of mathematics
There is no universal definition of chaos, but there are three ingredients on which all scientists agree: oscillating motion (trajectories do not settle to a fixed point, periodic orbit, or quasiperiodic orbit as time tends to infinity); determinism (the system is not random but deterministic, the irregular behaviour in finite dimension arising from nonlinearity); and sensitivity to initial conditions.
Chaotic systems are typically characterized by being modellable with a dynamical system that possesses an attractor.
Among attractors, a strange or chaotic attractor is defined as one that exhibits sensitive dependence on initial conditions.
These attractors exhibit sensitive dependence on initial conditions.
the system is attracted toward one type of motion, i.e., there is an attractor.
Chaotic attractor: it appears in nonlinear systems that are highly sensitive to initial conditions. A famous example of these attractors is the Lorenz attractor.
The Poincaré–Bendixson theorem shows that a strange attractor can only arise in a continuous dynamical system if it has three or more dimensions. This restriction does not apply, however, to discrete systems, which can exhibit strange attractors in two or even one dimension.
The first well-characterized system of equations to exhibit chaotic behaviour was the one proposed by Lorenz
Economic models might also be improved through the application of chaos theory
·es.wikipedia.org·
Chaos theory
Aliasing
Aliasing
When this happens, the original signal cannot be uniquely reconstructed from the digital signal.
each of the sinusoids becomes an "alias" for the other.
aliasing is the effect that causes distinct continuous signals to become indistinguishable when they are sampled digitally.
When a sinusoidal signal is sampled periodically, the samples obtained may be identical to those that would be obtained from another sinusoidal signal of lower frequency
The Nyquist theorem states that the minimum sampling frequency must be greater than 2·fmax, where fmax is the maximum frequency of the signal
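The effect is easy to demonstrate numerically: with fs = 100 Hz, a 130 Hz sine produces exactly the same samples as a 30 Hz sine, so each is an alias of the other. A minimal sketch (frequencies are our own illustrative choices):

```python
import math

fs = 100.0              # sampling frequency (Hz), below 2 * 130 Hz
f1, f2 = 30.0, 130.0    # f2 = f1 + fs, so both hit the same sample values

s1 = [math.sin(2 * math.pi * f1 * n / fs) for n in range(16)]
s2 = [math.sin(2 * math.pi * f2 * n / fs) for n in range(16)]
print(max(abs(a - b) for a, b in zip(s1, s2)))  # ~0: indistinguishable
```

Because sin(2π(f + fs)·n/fs) = sin(2πf·n/fs + 2πn), any frequency offset by a multiple of fs collapses onto the same sample sequence, which is why reconstruction is ambiguous below the Nyquist rate.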
·es.wikipedia.org·
Aliasing
Classification of equilibrium points in two and three dimensions
Classification of equilibrium points in two and three dimensions
An equilibrium (or equilibrium point) of a dynamical system generated by an autonomous system of ordinary differential equations (ODEs) is a solution that does not change with time
The equilibrium is said to be hyperbolic if all eigenvalues of the Jacobian matrix have non-zero real parts.
Node
Saddle
Focus
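In the two-dimensional linear case the classification follows from the eigenvalues of the (constant) Jacobian, which its trace and determinant determine. A minimal sketch (the function name and the zero-tolerance threshold are ours):

```python
import cmath

def classify_2d(a, b, c, d):
    """Classify the equilibrium at the origin of x' = [[a, b], [c, d]] x
    from the eigenvalues of the Jacobian (here the matrix itself)."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    l1 = (tr + cmath.sqrt(disc)) / 2
    l2 = (tr - cmath.sqrt(disc)) / 2
    if abs(l1.real) < 1e-12 or abs(l2.real) < 1e-12:
        return "non-hyperbolic"          # e.g. a center: no conclusion
    if l1.imag != 0:
        return "stable focus" if l1.real < 0 else "unstable focus"
    if l1.real * l2.real < 0:
        return "saddle"
    return "stable node" if l1.real < 0 else "unstable node"

print(classify_2d(-2, 0, 0, -3))   # stable node
print(classify_2d(1, 0, 0, -1))    # saddle
print(classify_2d(-1, -2, 2, -1))  # stable focus
```

The hyperbolicity test in the first branch mirrors the definition above: only when every eigenvalue has a nonzero real part does the linearization determine the local phase portrait.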
·scholarpedia.org·
Classification of equilibrium points in two and three dimensions
A detailed explanation of oscillatory motion
A detailed explanation of oscillatory motion
Second-order linear differential equations are used to model many situations in physics and engineering. Here, we look at how this works for systems of an object with mass attached to a vertical spring and an electric circuit containing a resistor, an inductor, and a capacitor connected in series. Models such as these can be used to approximate other more complicated situations; e.g., bonds between atoms or molecules are often modeled as springs that vibrate.
Simple Harmonic Motion
According to Hooke’s law, the restoring force of the spring is proportional to the displacement and acts in the opposite direction from the displacement
Damped Vibrations
Forced Vibrations
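As an illustration, a lightly damped mass-spring system m·x″ + c·x′ + k·x = 0 can be integrated with semi-implicit Euler; the parameter values below are our own illustrative choices (underdamped, since c² < 4mk):

```python
def damped_oscillator(m=1.0, c=0.4, k=4.0, x0=1.0, v0=0.0, dt=1e-4, t_end=10.0):
    """Semi-implicit Euler for m*x'' + c*x' + k*x = 0. With these defaults
    the system is underdamped (c**2 < 4*m*k), so the motion is a decaying
    oscillation. Returns the position sampled every 0.5 s."""
    x, v, out = x0, v0, []
    record = round(0.5 / dt)
    for n in range(round(t_end / dt)):
        v += dt * (-(c * v + k * x) / m)   # Hooke restoring force + damping
        x += dt * v
        if (n + 1) % record == 0:
            out.append(x)
    return out

xs = damped_oscillator()
print(xs[0], xs[-1])   # the oscillation decays toward zero
```

With damping ratio ζ = c/(2√(mk)) = 0.1, the envelope decays like e^(−ζω₀t) while the oscillation continues at the slightly reduced damped frequency, matching the closed-form underdamped solution.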
·math.libretexts.org·
A detailed explanation of oscillatory motion