Failure probability analysis of optical grid
Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

Optical grid, the integrated computing environment based on optical network, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. With optical-network-based distributed computing systems widely applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based analysis method for the application failure probability in optical grid. The failure probability of the entire application can then be quantified, and the performance of different backup strategies in reducing application failure probability can be compared, so that the different requirements of different clients can be satisfied according to the application failure probability. In optical grid, when an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability, improve network resource utilization, and realize a compromise between the network operator and the application submitter. Differentiated services can thereby be achieved in optical grid.

Modeling Finite-Time Failure Probabilities in Risk Analysis Applications
Dimitrova, Dimitrina S.; Kaishev, Vladimir K.; Zhao, Shouqi

In this article, we introduce a framework for analyzing the risk of systems failure based on estimating the failure probability. The latter is defined as the probability that a certain risk process, characterizing the operations of a system, reaches a possibly time-dependent critical risk level within a finite-time interval. Under general assumptions, we define two dually connected models for the risk process and derive explicit expressions for the failure probability and also the joint probability of the time of the occurrence of failure and the excess of the risk process over the risk level. We illustrate how these probabilistic models and results can be successfully applied in several important areas of risk analysis, among which are systems reliability, inventory management, flood control via dam management, infectious disease spread, and financial insolvency. Numerical illustrations are also presented.

Failure probability under parameter uncertainty

In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decisionmaker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls, to supply chain risks with inventory controls, and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. © 2015 Society for Risk Analysis.
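The threshold-exceedance effect described in the last abstract can be illustrated with a minimal numerical sketch. It assumes a log-normal risk factor and models parameter uncertainty as a normal perturbation of the location parameter; the 1% calibration and the value of `tau` are illustrative choices, not from the paper:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Risk factor X ~ LogNormal(mu, sigma); failure is the event {X > t}.
mu, sigma = 0.0, 1.0

# Calibrate the threshold t so the nominal failure probability is 1%.
z99 = 2.3263478740408408  # 99% standard normal quantile
t = math.exp(mu + sigma * z99)
p_known = 1.0 - phi((math.log(t) - mu) / sigma)

# Parameter uncertainty: suppose the location parameter is only known
# up to mu + N(0, tau^2).  The predictive distribution of log X is then
# N(mu, sigma^2 + tau^2), so the expected failure frequency exceeds
# the nominal 1% -- uncertainty fattens the tail beyond the threshold.
tau = 0.5
p_uncertain = 1.0 - phi((math.log(t) - mu) / math.sqrt(sigma**2 + tau**2))

print(round(p_known, 4))      # 0.01
print(p_uncertain > p_known)  # True
```

The comparison shows the qualitative claim only: averaging a tail probability over parameter error raises the expected failure frequency above the level the threshold was calibrated to.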