PLOS ONE search feed: subject "Reliability engineering", sorted by date (newest first). All PLOS articles are Open Access. Feed retrieved 2024-03-28.

Reliability analysis using Limit equilibrium and Smoothed Particle Hydrodynamics-based method for homogeneous soil slopes
by Xuesong Chu, Jiahui Wen, Liang Li
DOI: 10.1371/journal.pone.0300293 (published 2024-03-11)
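The Cov figures quoted in this entry come from propagating soil-property uncertainty through repeated simulations. A minimal Monte Carlo sketch of that idea in Python, with a purely hypothetical response function standing in for the paper's SPH-fitted regression model (the lognormal input model and all numeric values below are assumptions):

```python
import math
import random
import statistics

def cov_of_output(mean_c, cov_c, n=20_000, seed=42):
    """Monte Carlo estimate of the mean and coefficient of variation (Cov)
    of a response quantity driven by one lognormal soil property.

    The response function below is a purely hypothetical stand-in for the
    regression model fitted to SPH runs in the paper."""
    rng = random.Random(seed)
    # Lognormal parameters matched to the target mean and Cov of cohesion.
    sigma2 = math.log(1.0 + cov_c**2)
    mu = math.log(mean_c) - 0.5 * sigma2
    outputs = []
    for _ in range(n):
        c = rng.lognormvariate(mu, math.sqrt(sigma2))
        volume = 120.0 / c  # hypothetical response: volume shrinks as cohesion grows
        outputs.append(volume)
    m = statistics.fmean(outputs)
    return m, statistics.stdev(outputs) / m
```

Raising the input Cov raises the output Cov, which is the qualitative trend the abstract reports.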
This paper develops a combined method to efficiently predict the volume of the sliding mass for homogeneous slopes. First, the failure surface with the minimum factor of safety (FS) from the Limit Equilibrium Method (LEM) is equated to the surface determined by the Smoothed Particle Hydrodynamics (SPH) algorithm to obtain the threshold displacement separating unstable from stable particles. Second, this threshold displacement is used to identify the volume of the sliding mass with SPH. Finally, a regression model is fitted to a finite number of SPH simulations of homogeneous soil slopes. The proposed LEM-SPH method is illustrated on a cohesive soil slope. The study concludes that using the minimum-FS failure surface from LEM tends to underestimate the volume of the sliding mass and thus to give an unconservative risk value. The coefficient of variation (Cov) of the sliding-mass volume is 0.14, 0.28, 0.40, 0.48, and 0.53 for soil-property Cov values of 0.2, 0.3, 0.4, 0.5, and 0.6, respectively. The uncertainty of the soil properties therefore has a significant effect on the mean sliding-mass volume and, in turn, on the landslide risk value. The proposed method is particularly needed where soil properties carry large uncertainties.

Stochastic modeling and parameter estimation of turbogenerator unit of a thermal power plant under classical and Bayesian inferential framework
by Ashish Kumar, Ravi Chaudhary, Kapil Kumar, Monika Saini, Dinesh Kumar Saini, Punit Gupta
DOI: 10.1371/journal.pone.0292154 (published 2023-10-20)
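This entry assumes Weibull lifetimes with a common shape parameter and distinct scales. For a two-unit cold-standby system with no repair, the simplest such model, the system life is the sum of two Weibull lifetimes, and the mean time to system failure (MTSF) can be checked against a closed form. A sketch with illustrative values only (the paper's repair mechanism and semi-Markov machinery are not reproduced):

```python
import math
import random

def mtsf_cold_standby(shape, scale_main, scale_standby, n=50_000, seed=1):
    """Monte Carlo MTSF of a two-unit cold-standby system without repair:
    the standby starts only when the main unit fails, so system life is
    the sum of two independent Weibull lifetimes sharing one shape
    parameter (mirroring the abstract's assumption)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        life_main = rng.weibullvariate(scale_main, shape)      # (scale, shape)
        life_standby = rng.weibullvariate(scale_standby, shape)
        total += life_main + life_standby
    return total / n

def mtsf_exact(shape, scale_main, scale_standby):
    """Closed form: E[T] = (scale_main + scale_standby) * Gamma(1 + 1/shape)."""
    return (scale_main + scale_standby) * math.gamma(1.0 + 1.0 / shape)
```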
The present study develops a novel stochastic model and estimates its parameters to assess the reliability characteristics of a turbogenerator unit of a thermal power plant under classical and Bayesian frameworks. The turbogenerator unit consists of five components: the turbine lubrication, turbine governing, generator oil, generator gas, and generator excitation systems. Cold-standby redundancy and Weibull-distributed random variables are used in the model; all random variables share the same shape parameter but have different scale parameters. The regenerative point technique and a semi-Markov approach are used to evaluate the reliability characteristics. A sufficient repair facility is assumed to be always available in the plant, and repairs are assumed to be perfect. Because life-testing experiments are time consuming, a Monte Carlo simulation study is carried out to highlight the value of the proposed model. A comparative analysis of the true, classical, and Bayesian estimates of the mean time to system failure (MTSF), availability, and profit function is presented.

Excalibur: A new ensemble method based on an optimal combination of aggregation tests for rare-variant association testing for sequencing data
by Simon Boutry, Raphaël Helaers, Tom Lenaerts, Miikka Vikkula
DOI: 10.1371/journal.pcbi.1011488 (published 2023-09-14)
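Excalibur's optimal combination of 36 aggregation tests is specific to the paper and not reproduced here. As a generic illustration of turning several tests' p-values into one decision, Fisher's method is the classic baseline (no resemblance to Excalibur's actual combination rule is implied):

```python
import math

def fisher_combine(p_values):
    """Fisher's method: X = -2 * sum(ln p_i) follows a chi-square
    distribution with 2k degrees of freedom under the global null.
    Returns the combined statistic and its p-value, using the
    closed-form chi-square survival function for even degrees of
    freedom: P(X > x) = exp(-x/2) * sum_{i=0}^{k-1} (x/2)^i / i!."""
    k = len(p_values)
    stat = -2.0 * sum(math.log(p) for p in p_values)
    half = stat / 2.0
    term, total = 1.0, 1.0
    for i in range(1, k):
        term *= half / i
        total += term
    return stat, math.exp(-half) * total
```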
The development of high-throughput next-generation sequencing technologies and large-scale genetic association studies has produced numerous advances in biostatistics. Aggregation tests, i.e. statistical methods that analyze associations of a trait with multiple markers within a genomic region, have yielded a variety of novel discoveries. Notwithstanding their usefulness, no single test fits all needs; each suffers from specific drawbacks. Selecting the right aggregation test, when the underlying genetic model of the disease is unknown, remains an important challenge. Here we propose a new ensemble method, called Excalibur, based on an optimal combination of 36 aggregation tests created after an in-depth study of the limitations of each test and their impact on the quality of the results. Our findings demonstrate the method's ability to control type I error and show that it offers the best average power across all scenarios. The proposed method enables novel advances in whole exome/genome sequencing association studies, handles a wide range of association models, and provides researchers with an optimal aggregation analysis for the genetic regions of interest.

Service reliability evaluation of highway tunnel based on digital image processing
by Chunquan Dai, Zhaochen Zhou, Huidi Zhang, Kun Jiang, Haisheng Li, Haiyang Yu
DOI: 10.1371/journal.pone.0288633 (published 2023-08-11)
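One of the three crack parameters this entry extracts, the fractal dimension, can be estimated from a binary image by box counting. A small self-contained sketch (the paper's exact estimator and image preprocessing are not specified here):

```python
import math

def box_count_dimension(pixels, sizes=(1, 2, 4, 8)):
    """Estimate the box-counting fractal dimension of a binary image,
    given as a set of (row, col) foreground pixels. For each box size s,
    count the occupied s-by-s boxes; the dimension is the slope of
    log(count) versus log(1/s), fitted by least squares."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(r // s, c // s) for (r, c) in pixels}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den
```

A straight crack gives a dimension near 1 and a filled region near 2; branching cracks fall in between.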
China is gradually transitioning from the "tunnel construction exploration era" to the "tunnel high-quality construction and operation era", and the maintenance demand of highway tunnels has increased sharply. There is therefore an urgent need for a method to evaluate the service reliability of highway tunnels and guide maintenance personnel in their work. Taking highway tunnels as the research object, this paper extracts three parameters (length, maximum width, and fractal dimension) from binary images of highway tunnel lining cracks. Tunnels are divided into 500 m sections, and a section-level defect sample space covering multiple highway tunnels is constructed. The EM clustering algorithm is used to determine the number of defect severity levels, and the relative Euclidean distance serves as the evaluation index to divide tunnel safety into five grades: normal, degraded, inferior, deteriorated, and hazardous. The partial least squares method is used to establish a lining service reliability evaluation formula, whose residual is checked at each sample point in the sample space; the smaller the average residual, the better the fit of the evaluation formula. The proposed evaluation method is applied in engineering practice and compared with the expert scoring method and the national standard method, showing that it is transparent, simple to apply, and well suited to engineering practice.

Inference on the stress strength reliability with exponentiated generalized Marshall Olkin-G distribution
by Neama Salah Youssef Temraz
DOI: 10.1371/journal.pone.0280183 (published 2023-08-09)
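The quantity at the heart of this entry is the stress-strength reliability R = P(Y < X), the probability that a random stress Y stays below a random strength X. A distribution-free point estimate from two samples, as a simple contrast to the parametric MLE the paper derives:

```python
def stress_strength_reliability(strengths, stresses):
    """Nonparametric estimate of R = P(Y < X) from a sample of strengths
    (X) and a sample of stresses (Y): the fraction of all (X, Y) pairs in
    which the stress falls below the strength. This Mann-Whitney-style
    estimate is distribution-free, unlike the parametric MLE developed
    for the exponentiated generalized Marshall Olkin-G family."""
    wins = sum(1 for x in strengths for y in stresses if y < x)
    return wins / (len(strengths) * len(stresses))
```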
In this paper, inference on the stress-strength reliability model is introduced for the exponentiated generalized Marshall Olkin-G family of distributions. The maximum likelihood estimator of the stress-strength reliability function is derived, and asymptotic and bootstrap confidence intervals for it are constructed. A Bayesian inference procedure for the stress-strength reliability is also introduced. A simulation study is used to obtain the maximum likelihood and Bayesian estimates, and real-data applications illustrate the stress-strength model and compare the exponentiated generalized Marshall Olkin-G distribution with other distributions.

A different method of fault feature extraction under noise disturbance and degradation trend estimation with system resilience for rolling bearings
by Baoshan Zhang, Jilian Guo, Feng Zhou, Xuan Wang, Shengjun Wei
DOI: 10.1371/journal.pone.0287544 (published 2023-07-06)
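This entry selects a denoising level by minimizing a signal-complexity criterion. As a generic stand-in for that idea (the paper uses a Bayesian inference criterion, not the measure below), normalized permutation entropy gives a complexity index in [0, 1]: low values indicate regular structure, values near 1 indicate noise-like behavior.

```python
import math
from itertools import permutations

def permutation_entropy(signal, order=3):
    """Normalized permutation entropy of a 1-D signal: the Shannon
    entropy of the ordinal patterns of consecutive windows, divided by
    log(order!). Serves here as a simple complexity proxy for judging
    how much disturbance remains after denoising."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(signal) - order + 1):
        window = signal[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    total = sum(counts.values())
    h = -sum((c / total) * math.log(c / total) for c in counts.values() if c)
    return h / math.log(math.factorial(order))
```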
Because of noise disturbances and system resilience effects, current methods for rolling bearing fault feature extraction and degradation trend estimation struggle to achieve satisfactory results. To address these issues, we propose a different method for fault feature extraction and degradation trend estimation. First, we preset a Bayesian inference criterion to evaluate the complexity of the denoised vibration signal; when the complexity reaches its minimum, the noise disturbance is taken to be fully removed. Second, we define the system resilience obtained from a Bayesian network as an intrinsic index of the system and use it to correct the equipment degradation trend obtained by the multivariate status estimation technique. Finally, the effectiveness of the proposed method is verified by the completeness of the extracted fault features and the accuracy of the degradation trend estimate over the whole life cycle of the bearing degradation data.

Solving the general 3-D safety factor by combining Sarma’s idea with the assumption of normal stress distribution over the slip surface
by Linghui Wang, Kunlin Lu
DOI: 10.1371/journal.pone.0287998 (published 2023-06-29)
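This entry ties the factor of safety to the minimum horizontal seismic coefficient. A toy version of that search: given a hypothetical, monotonically decreasing FS(k) curve (the linear form below is an assumption for illustration, not the paper's limit equilibrium solution), bisection finds the critical coefficient k_c at which FS drops to 1:

```python
def critical_seismic_coefficient(fs, lo=0.0, hi=1.0, tol=1e-8):
    """Bisection for the critical horizontal seismic coefficient k_c at
    which the factor of safety equals 1.0, assuming fs is continuous and
    decreasing in k with fs(lo) > 1 > fs(hi)."""
    assert fs(lo) > 1.0 > fs(hi), "bracket must straddle FS = 1"
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if fs(mid) > 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def example_fs(k):
    """Hypothetical linear FS(k): FS falls as the seismic load grows."""
    return 1.35 - 1.2 * k
```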
This study proposes a method for determining 3-D limit equilibrium solutions. The method, inspired by Sarma, introduces the horizontal seismic coefficient as a slope failure parameter and modifies the normal stress over the slip surface. Four equilibrium equations are used to solve the problem without compromising the accuracy of the calculations: three force equilibrium equations in the x, y, and z directions and a moment equilibrium equation about the vertical (z) axis. A reliable factor of safety is determined by finding the minimum value of the horizontal seismic coefficient. Several typical examples of symmetric and asymmetric slopes show good consistency with the existing literature, indicating the reliability of the factors of safety obtained. The proposed method is attractive for its straightforward principle, convenient operation, fast convergence, and ease of programming.

Benchmarks of production for atmospheric water generators in the United States
by Erica Sadowski, Eric Mbonimpa, Christopher M. Chini
DOI: 10.1371/journal.pwat.0000133 (published 2023-06-08)
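The reliability-resilience-vulnerability indicators this entry applies are commonly computed from a time series of satisfactory/failure states (following the classic Hashimoto-style definitions; the paper's exact variants may differ, and the vulnerability proxy below is a simplification):

```python
def rrv_indicators(satisfied):
    """Hydrologic performance indicators from a boolean time series
    (True = demand met in that period).
    reliability  : fraction of periods in a satisfactory state
    resilience   : probability a failure period is followed by recovery
    vulnerability: length of the longest failure run, a simple severity
                   proxy (the paper may define severity differently)."""
    n = len(satisfied)
    reliability = sum(satisfied) / n
    failures = [i for i, ok in enumerate(satisfied) if not ok]
    recoveries = sum(1 for i in failures if i + 1 < n and satisfied[i + 1])
    resilience = recoveries / len(failures) if failures else 1.0
    worst, run = 0, 0
    for ok in satisfied:
        run = 0 if ok else run + 1
        worst = max(worst, run)
    return reliability, resilience, worst
```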
Atmospheric Water Generators (AWGs) extract water from the air using one of three available technologies: refrigeration, sorption, and fog harvesting. In this research, we analyze two refrigeration-based devices and one sorption-based device and their efficacy in providing supplemental water supply across the United States. An AWG can supply potable water to remote and austere locations where clean drinking water might otherwise be unavailable. With increasing water scarcity globally, particularly in historically arid climates, methods that can draw on an estimated 13,000 km³ of water held in the atmosphere become important and potentially viable. However, because of climatological and technological constraints, not all regions of the world would see the same water production from an AWG: production is driven by high relative humidity and temperature. This climatological reliance also subjects the devices to dramatic seasonal changes in performance. Using previously established hydrologic performance indicators (reliability, resilience, and vulnerability) and weather data for the United States, we determine year-round efficiency metrics for three AWGs. By evaluating three different devices and mapping their efficiency across the United States, this research determines the regional efficacy, as a function of water production, of adopting AWG technology to supplement potable water supply. The study provides insights into AWG performance at high spatial resolution through comparison of multiple devices. The results indicate minimal viability for a large portion of the United States; however, we highlight the potential for the device to supply water to a remote military installation in Hawaii.

Availability optimization of power generating units used in sewage treatment plants using metaheuristic techniques
by Monika Saini, Ashish Kumar, Dinesh Kumar Saini, Punit Gupta
DOI: 10.1371/journal.pone.0284848 (published 2023-05-04)
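The Chapman-Kolmogorov equations this entry derives reduce, for a single repairable unit with constant rates, to a two-state birth-death model whose steady-state availability has a closed form. A sketch of both the closed form and a transient check (constant rates are an assumption; the paper's multi-unit model and the GA/PSO search are not reproduced):

```python
def steady_state_availability(failure_rate, repair_rate):
    """Steady-state availability of a single repairable unit modeled as a
    two-state Markov birth-death process. Setting the time derivatives in
    the Chapman-Kolmogorov equations to zero gives the balance equation
    lam * P_up = mu * P_down, and with P_up + P_down = 1:
        A = mu / (lam + mu)."""
    lam, mu = failure_rate, repair_rate
    return mu / (lam + mu)

def transient_availability(failure_rate, repair_rate, t_end=100.0, dt=1e-3):
    """Euler integration of dP_up/dt = -lam * P_up + mu * (1 - P_up),
    starting from P_up(0) = 1; for large t this approaches the
    steady-state value above."""
    lam, mu = failure_rate, repair_rate
    p = 1.0
    for _ in range(int(t_end / dt)):
        p += dt * (-lam * p + mu * (1.0 - p))
    return p
```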
Metaheuristic techniques have been utilized extensively to predict the optimum availability of industrial systems, an optimization problem that is NP-hard. However, most existing methods fail to attain the optimal solution because of limitations such as a slow rate of convergence, weak computational speed, and entrapment in local optima. Consequently, the present study develops a novel mathematical model for power generating units assembled in sewage treatment plants. A Markov birth-death process is adopted for model development and for generating the Chapman-Kolmogorov differential-difference equations. The global solution is found using two metaheuristic techniques: a genetic algorithm and particle swarm optimization. All time-dependent random variables associated with failure rates are considered exponentially distributed, while repair rates follow an arbitrary distribution; repair and switching devices are assumed perfect, and the random variables are independent. Numerical availability results are derived for different values of crossover, mutation, number of generations, damping ratio, and population size to attain the optimum, and were shared with plant personnel. A statistical investigation of the availability results shows that particle swarm optimization outperforms the genetic algorithm in predicting the availability of power-generating systems. The developed model can help sewage treatment plant designers establish new plants and propose maintenance policies, and the same performance-optimization procedure can be adopted in other process industries.

Data analysis for COVID-19 deaths using a novel statistical model: Simulation and fuzzy application
by El-Sayed A. El-Sherpieny, Ehab M. Almetwally, Abdisalam Hassan Muse, Eslam Hussam
DOI: 10.1371/journal.pone.0283618 (published 2023-04-10)
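The MKTL density itself is not reproduced here. As a stand-in illustration of fitting a two-parameter lifetime model by maximum likelihood, the sketch below fits a Weibull model by brute-force grid search over the log-likelihood (real analyses use numerical optimizers, and the grid values are arbitrary):

```python
import math

def weibull_loglik(data, shape, scale):
    """Log-likelihood of a two-parameter Weibull sample; a stand-in
    lifetime family, not the paper's MKTL density."""
    k, lam = shape, scale
    return sum(
        math.log(k / lam) + (k - 1) * math.log(x / lam) - (x / lam) ** k
        for x in data
    )

def fit_weibull_grid(data, shapes, scales):
    """Brute-force MLE over a parameter grid: transparent, if crude,
    compared with the numerical optimizers used in practice."""
    return max(
        ((s, sc) for s in shapes for sc in scales),
        key=lambda p: weibull_loglik(data, p[0], p[1]),
    )
```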
This paper introduces the two-parameter modified Kies Topp–Leone (MKTL) lifetime model, which is more flexible than the well-known conventional distributions. Compared to current distributions, the new model yields an unusually varied collection of probability functions: its density and hazard rate functions can take several shapes, demonstrating that the model is flexible enough for several kinds of data. Multiple statistical characteristics are obtained. To estimate the parameters of the MKTL model, we employ various estimation techniques, including maximum likelihood estimators (MLEs) and a Bayesian estimation approach. Within the reliability analysis framework, we compare the traditional reliability function model to the fuzzy reliability function model. A complete Monte Carlo simulation analysis is conducted to determine the precision of these estimators. The suggested model outperforms competing models in real-world applications and may be chosen as an enhanced model for building a statistical model for the COVID-19 data and other data sets with similar features.

Validity and intrarater reliability of a novel device for assessing Plantar flexor strength
by Seth O’Neill, Alice Weeks, Jens Eg Nørgaard, Martin Gronbech Jorgensen
DOI: 10.1371/journal.pone.0282395 (published 2023-03-31)
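The validity analysis in this entry rests on a Pearson correlation between paired measurements from the two devices. For reference, the coefficient computed from scratch (the sample values in the test are made up, not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between paired measurements,
    e.g. peak torque on an isokinetic dynamometer versus force on a
    portable device, as in the criterion-validity analysis below."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)
```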
Introduction: Plantar flexor weakness is an identified prospective risk factor for developing Achilles tendinopathy, and various authors have reported relationships between symptoms and weakness of this muscle group. Despite this relationship, many clinicians and researchers fail to examine plantar flexor strength because of the cumbersome, stationary and expensive nature of the isokinetic dynamometer (IKD), the recognized gold standard. This study examined the validity and reliability of a fast, easy and portable device for assessing plantar flexion strength.

Methods: Validity between the Cybex NORM® by Humac and the C-Station by FysioMeter was explored using the Pearson correlation coefficient. Participants were randomly assigned to start on the Cybex NORM® or the FysioMeter C-Station. Intra-rater reliability on the C-Station was investigated by test-retest two days apart using the Intraclass Correlation Coefficient (ICC). All testing involved isometric maximal force of the soleus muscle with the knee at 90 degrees of flexion.

Results: 40 healthy university students were recruited for the validity part of the study and 65 for the reliability part. The mean peak torque on the IKD was 198.55 Nm (SD 94.45) versus 1443.88 (412.82) N on the C-Station. The Pearson correlation was r = 0.72 (95% CI 0.52 to 0.84). The test-retest reliability was an ICC of 0.91 (95% CI 0.86 to 0.94).

Conclusions: The C-Station by FysioMeter appears to provide valid measures and to have excellent reliability for plantar flexor isometric strength. It would appear suitable for both clinical and research work.

Strategy and additive technologies as the catalyst for outsourcing, process innovation and operational effectiveness
by Thomas Tegethoff, Ricardo Santa, Edgardo Cayón, Annibal Scavarda
DOI: 10.1371/journal.pone.0282366 (published 2023-02-27)
Purpose: There is rising interest in Industry 4.0 as a factor in organizational competitiveness. Although many companies are aware of the importance of Industry 4.0, the development of such initiatives in Colombia is slow. This research therefore investigates the impact of additive technologies, as part of the Industry 4.0 concept, on operational effectiveness and hence on the competitiveness of the organization, and seeks to establish the factors that hinder the adequate implementation of such new, innovative technologies.

Design/Methodology/Approach: Structural equation modeling was used to analyze the antecedents and outcomes of operational effectiveness, based on 946 usable questionnaires collected from managers and personnel of Colombian organizations.

Findings: Initial findings show that management is aware of Industry 4.0 concepts and implements strategies for such initiatives. Nevertheless, neither process innovation nor additive technologies have a significant impact on operational effectiveness, and therefore on the competitiveness of the organization.

Practical implications: Implementing new, innovative technologies requires closing the digital gap between urban and rural areas and between large enterprises and small and medium-sized ones. Similarly, Industry 4.0 as a new, innovative manufacturing concept requires transversal implementation to increase organizational competitiveness.

Originality/Value: The value of this paper lies in discussing the current technological and human capabilities and strategies that Colombian organizations, as an example of a developing nation, should improve to leverage the benefits of Industry 4.0 and remain competitive. The results are probably generalizable to other regions in developing countries throughout the world.

Design of Active Fault-Tolerant Control System for Air-Fuel Ratio control of Internal Combustion engine using nonlinear regression-based observer model
by Turki Alsuwian, Arslan Ahmed Amin, Muhammad Sajid Iqbal, Muhammad Bilal Qadir, Saleh Almasabi, Mohammed Jalalah
DOI: 10.1371/journal.pone.0279101 (published 2022-12-15)
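The analytical-redundancy idea in this entry can be reduced to a residual check: compare each sensor reading against a model-based estimate and substitute the estimate when the residual is too large, so the control loop keeps running instead of shutting the engine down. A schematic sketch (the observer itself, a nonlinear regression in the paper, is abstracted into a given estimate, and the threshold is an assumption):

```python
def detect_sensor_fault(measured, estimate, threshold):
    """One fault-detection-and-isolation (FDI) step under analytical
    redundancy: flag a fault when the residual between the sensor
    reading and the observer estimate exceeds a threshold, and in that
    case use the estimate in place of the reading."""
    residual = abs(measured - estimate)
    faulty = residual > threshold
    value_used = estimate if faulty else measured
    return faulty, value_used
```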
Internal Combustion (IC) engines are prevalent in the process sector, and maintaining adequate Air-Fuel Ratio (AFR) regulation in their fuel system is crucial for engine performance, fuel economy, and environmental safety. Faults in the AFR system’s sensors cause the engine to shut down, so fault tolerance is essential. To avoid engine shutdown, this paper offers a novel Active Fault-Tolerant Control System (AFTCS) for air-fuel ratio control of an IC engine in a process plant. In the Fault Detection and Isolation (FDI) unit, the proposed AFTCS uses a nonlinear regression-based observer model for analytical redundancy. The system was simulated in the MATLAB/Simulink environment and tested at two speeds (300 r/min and 600 r/min); the results show that its response stays within acceptable bounds without compromising stability. The findings also demonstrate higher fault tolerance for sensor defects in the AFR control system, particularly for the manifold absolute pressure (MAP) sensor at 300 r/min, in terms of reduced oscillatory response compared with the current literature. Compared to linear regression-based and Genetic Algorithm (GA)-based models, the nonlinear regression-based model estimates the faulty sensors more accurately and is also efficient in terms of computational power and response time.

Study on connectivity of buried pipeline network considering nodes reliability under seismic action
by Delong Huang, Zhongling Zong, Aiping Tang
DOI: 10.1371/journal.pone.0271533 (published 2022-08-22)
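The core idea of this entry, folding node (elbow/tee) reliabilities into edge reliabilities before computing source-terminal connectivity, can be shown on the smallest case: series reliability along one path, then two disjoint paths in parallel. Real networks require the recursive decomposition over all non-intersecting minimal paths described in the abstract; the values below are illustrative only.

```python
def path_reliability(edge_rels, node_rels):
    """Series reliability of one pipeline path: the product of its pipe
    (edge) reliabilities and of the elbow/tee (node) reliabilities along
    it, i.e. node reliability embedded into the edges."""
    r = 1.0
    for x in list(edge_rels) + list(node_rels):
        r *= x
    return r

def two_path_connectivity(path_a, path_b):
    """Source-terminal connectivity when two independent minimal paths
    exist: 1 - (1 - R_a)(1 - R_b)."""
    return 1.0 - (1.0 - path_a) * (1.0 - path_b)
```

Note how including a node term can only lower a path's reliability, which is the abstract's observation that network reliability drops once elbows and tees are considered.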
Current connectivity calculations for complex pipeline networks are mostly simplified, or they ignore the influence of nodes such as elbows and tees on the connectivity reliability of the entire network. Historical earthquake damage shows that the seismic performance of municipal buried pipelines depends on the ability of nodes and interfaces to resist deformation, so the influence of node reliability on network connectivity under reciprocating loading is a key issue to be addressed. Therefore, starting from a general connectivity probability algorithm, this paper embeds the reliability of nodes into the reliability of edges and derives a more detailed and comprehensive non-intersecting minimal-path recursive decomposition algorithm that accounts for elbows, tees, and other nodes. The reliability of the various node types in different soils is then calculated by finite element numerical simulation, based on the reliability calculation theory of pipeline components. Finally, a small simple pipeline network and a large complex pipeline network are used as examples to show the importance of considering nodes in connectivity calculations: the reliability of the network system decreases significantly once nodes such as elbows and tees are considered, damage to one node usually causes the failure of all pipes along its path, and the damage probability is greater in areas dense with elbow and tee nodes. All node types that are prone to damage are considered in detail in the calculation; as a result, the proposed algorithm improves computational accuracy and lays the foundation for further accurate calculation of pipeline network connectivity.

The influence of the pavement friction coefficient evolution caused by traffic flow on the risk of motorway horizontal curves
by Guilong Xu, Jinliang Xu, Huagang Shan, Chao Gao, Jinsong Ran, Yongji Ma, Yuhong Yao
DOI: 10.1371/journal.pone.0266519 (published 2022-08-22)
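The Hasofer-Lind method used in this entry reduces, for a linear limit state with independent normal variables, to a closed-form reliability index. A sketch with hypothetical friction supply/demand statistics (the paper's actual limit-state function and input distributions are not reproduced, and general nonlinear limit states require the iterative Hasofer-Lind algorithm):

```python
import math

def hasofer_lind_linear(mu_supply, sd_supply, mu_demand, sd_demand):
    """Hasofer-Lind reliability index for the linear limit state
    g = supply - demand with independent normal variables, e.g. available
    pavement friction versus the friction demanded by the curve. For this
    special case beta has a closed form, and the skid probability is the
    standard normal tail Phi(-beta)."""
    beta = (mu_supply - mu_demand) / math.sqrt(sd_supply**2 + sd_demand**2)
    p_failure = 0.5 * math.erfc(beta / math.sqrt(2.0))  # Phi(-beta)
    return beta, p_failure
```

As polishing lowers the mean friction supply, beta falls and the skid probability rises, which is the qualitative trend the abstract reports for high-CTV roads.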
The friction coefficient between the tire and the road is one of the key parameters affecting road traffic safety. The purpose of this paper is to quantify the skid risk for vehicles due to the friction evolution caused by traffic polishing on horizontal curves. Based on reliability theory, an innovative dynamic risk assessment model is developed for passenger cars and trucks. The influence of two traffic characteristics on pavement friction is quantified: cumulative traffic volume (CTV) and annual average daily traffic of trucks (AADTT). The speed distribution on motorway horizontal curves, obtained through field experiments, serves as a basic input to the model. The Hasofer-Lind method is adopted to solve for the reliability and the probability of vehicle skidding. The results show that, among the traffic characteristics, AADTT has a significant impact on pavement friction: when the AADTT on a road exceeds 2000 veh/d, increasing CTV causes friction to decrease rapidly and therefore has a significant impact on horizontal curve risk. In particular, for roads with a CTV of more than 50 million vehicles, the risk of the horizontal curve rises sharply as CTV grows. The results can provide a reference for road maintenance departments in deciding the timing of road maintenance.