
Non-parametric measure approximations for constrained multi-objective optimisation under uncertainty. (English) Zbl 07845190

Summary: In this article, we propose non-parametric estimations of the approximation error of robustness and reliability measures, employed in the context of constrained multi-objective optimisation under uncertainty (OUU). These tunable-accuracy approximations make it possible to capture the Pareto front in a parsimonious way and can be exploited within an adaptive refinement strategy. First, we illustrate an efficient approach for obtaining joint representations of the robustness and reliability measures, allowing sharper discrimination of Pareto-optimal designs. A specific surrogate model of these objectives and constraints is then proposed to accelerate the optimisation process. Second, we propose an adaptive refinement strategy that uses these tunable-accuracy approximations to concentrate the computational effort on the optimal region. To this end, an adapted Pareto dominance rule and a Pareto-optimal probability computation are formulated. The performance of the proposed strategy is assessed on several analytical test cases against classical approaches. We also illustrate the method on an engineering application, performing shape OUU of an Organic Rankine Cycle turbine.
© 2024 John Wiley & Sons, Ltd.
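To give a concrete flavour of an adapted Pareto dominance rule under tunable-accuracy approximations, the following is a minimal Python sketch. It assumes the approximation error of each objective is represented as a bounding box around its estimate, in the spirit of the authors' earlier surrogate-assisted bounding-box work [22, 49]; the function names, the conservative dominance variant and the toy data are illustrative assumptions, not the paper's implementation.

# Minimal sketch (not the authors' algorithm): conservative Pareto dominance
# between designs whose objectives are only known up to a tunable
# approximation error, represented as boxes [estimate - err, estimate + err].
import numpy as np

def box_dominates(f_a, err_a, f_b, err_b):
    # Design A surely dominates design B (minimisation) if every upper bound
    # of A is no worse than the corresponding lower bound of B, with strict
    # improvement in at least one objective.
    upper_a = np.asarray(f_a) + np.asarray(err_a)
    lower_b = np.asarray(f_b) - np.asarray(err_b)
    return np.all(upper_a <= lower_b) and np.any(upper_a < lower_b)

def conservative_pareto_front(estimates, errors):
    # Keep every design that is not surely dominated by another one. Loose
    # error bounds retain many candidates; refining the approximations
    # (shrinking `errors`) prunes the set towards the true Pareto front.
    n = len(estimates)
    keep = []
    for i in range(n):
        dominated = any(
            box_dominates(estimates[j], errors[j], estimates[i], errors[i])
            for j in range(n) if j != i
        )
        if not dominated:
            keep.append(i)
    return keep

# Toy usage: three designs, two objectives, per-objective error bounds.
F = [[1.0, 3.0], [1.1, 3.2], [2.5, 0.5]]
E = [[0.05, 0.05], [0.05, 0.05], [0.05, 0.05]]
print(conservative_pareto_front(F, E))  # prints [0, 2]: design 1 is surely dominated

With loose error bounds the rule is deliberately indecisive, which is what allows an adaptive strategy to spend refinement effort only on designs whose dominance status is still ambiguous.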

MSC:

90C29 Multi-objective and goal programming
90C17 Robustness in mathematical programming

Software:

GPy
Full Text: DOI

References:

[1] ThoreCJ, Alm GrundströmH, KlarbringA. Game formulations for structural optimization under uncertainty. Int J Numer Methods Eng. 2020;121(1):165‐185. · Zbl 07841257
[2] CookLW, JarrettJP, WillcoxKE. Generalized information reuse for optimization under uncertainty with non‐sample average estimators. Int J Numer Methods Eng. 2018;115(12):1457‐1476. · Zbl 07865116
[3] WangX, HirschC, LiuZ, KangS, LacorC. Uncertainty‐based robust aerodynamic optimization of rotor blades. Int J Numer Methods Eng. 2013;94(2):111‐127. · Zbl 1352.74232
[4] JungS, OkSY, SongJ. Robust structural damage identification based on multi‐objective optimization. Int J Numer Methods Eng. 2010;81(6):786‐804. · Zbl 1183.74084
[5] ZitzlerE, KalyanmoyD, ThieleL. Comparison of multiobjective evolutionary algorithms: empirical results. Evolut Comput. 2000;8(2):173‐195.
[6] JinY, BrankeJ, et al. Evolutionary optimization in uncertain environments‐a survey. IEEE Trans Evolut Comput. 2005;9(3):303‐317.
[7] GohCK, TanKC. Evolutionary multi‐objective optimization in uncertain environments. Issues Algor, Stud Comput Intell. 2009;186:5‐18. · Zbl 1226.90004
[8] HoCP, ParpasP. Multilevel Optimization Methods: Convergence and Problem Structure. Department of Computing, Imperial College; 2016.
[9] El‐BeltagyM, KeaneA. Optimisation for multilevel problems: A comparison of various algorithms. Adaptive Computing in Design and Manufacture. Springer; 1998:111‐120.
[10] MarchA, WillcoxK. Provably convergent multifidelity optimization algorithm not requiring high‐fidelity derivatives. AIAA J. 2012;50(5):1079‐1089.
[11] ZahirMK, GaoZ. Variable‐fidelity optimization with design space reduction. Chinese J Aeronaut. 2013;26(4):841‐849.
[12] NgLWT, WillcoxKE. Multifidelity approaches for optimization under uncertainty. Int J Numer Methods Eng. 2014;100(10):746‐772. · Zbl 1352.74230
[13] PichenyV, GinsbourgerD, RichetY. Noisy expected improvement and on‐line computation time allocation for the optimization of simulators with tunable fidelity. 2nd International Conference on Engineering Optimization, September 6‐9, 2010, Lisbon, Portugal; 2010.
[14] TeichJ. Pareto‐front exploration with uncertain objectives. International Conference on Evolutionary Multi‐Criterion Optimization. Springer; 2001:314‐328.
[15] EskandariH, GeigerCD, BirdR. Handling uncertainty in evolutionary multiobjective optimization: SPGA. 2007 IEEE Congress on Evolutionary Computation. IEEE; 2007:4130‐4137.
[16] GongDW, QinNN, SunXY. Evolutionary algorithms for multi‐objective optimization problems with interval parameters. 2010 IEEE Fifth International Conference on Bio‐Inspired Computing: Theories and Applications (BIC‐TA). IEEE; 2010:411‐420.
[17] SoaresGL, GuimarãesFG, MaiaCA, VasconcelosJA, JaulinL. Interval robust multi‐objective evolutionary algorithm. 2009 IEEE Congress on Evolutionary Computation. IEEE; 2009:1637‐1643.
[18] KhosraviF, RaßA, TeichJ. Efficient computation of probabilistic dominance in robust multi‐objective optimization. arXiv:1910.08413; 2019.
[19] KhosraviF, RassA, TeichJ. Efficient computation of probabilistic dominance in multi‐objective optimization. ACM Trans Evol Learn Optim. 2021;1(4):1‐26. doi:10.1145/3469801
[20] MlakarM, TušarT, FilipičB. Comparing solutions under uncertainty in multiobjective optimization. Math Probl Eng. 2014;2014:1‐10. · Zbl 1407.90299
[21] FusiF, CongedoPM. An adaptive strategy on the error of the objective functions for uncertainty‐based derivative‐free optimization. J Comput Phys. 2016;309:241‐266. · Zbl 1351.90166
[22] RivierM, CongedoPM. Surrogate‐assisted Bounding‐Box approach applied to constrained multi‐objective optimisation under uncertainty. Reliab Eng Syst Saf. 2022;217:108039. https://www.sciencedirect.com/science/article/pii/S0951832021005445
[23] QingJ, DhaeneT, CouckuytI. Spectral representation of robustness measures for optimization under input uncertainty. In ICML2022, the 39th International Conference on Machine Learning. Vol 162. 2022:1‐26. https://icml.cc/virtual/2022/poster/17741
[24] MutnyM, KrauseA. Efficient high dimensional bayesian optimization with additivity and quadrature fourier features. In: BengioS (ed.), WallachH (ed.), LarochelleH (ed.), GraumanK (ed.), Cesa‐BianchiN (ed.), GarnettR (ed.), eds. Advances in Neural Information Processing Systems. Vol 31. Curran Associates, Inc.; 2018. https://proceedings.neurips.cc/paper.files/paper/2018/file/4e5046fc8d6a97d18a5f54beaed54dea‐Paper.pdf
[25] MaddoxWJ, BalandatM, WilsonAG, BakshyE. Bayesian optimization with high‐dimensional outputs. Advances in Neural Information Processing Systems. Vol 34. 2021:19274‐19287.
[26] BogunovicI, ScarlettJ, JegelkaS, CevherV. Adversarially Robust Optimization with Gaussian Processes. Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS’18). Curran Associates, Inc; 2018:5765‐5775.
[27] InatsuY, TakenoS, KarasuyamaM, TakeuchiI. Bayesian optimization for distributionally robust chance‐constrained problem. International Conference on Machine Learning. PMLR; 2022:9602‐9621. https://api.semanticscholar.org/CorpusID:246430364
[28] QingJ, CouckuytI, DhaeneT. A robust multi‐objective Bayesian optimization framework considering input uncertainty. J Global Optim. 2022;86(3):693‐711. · Zbl 1530.90094
[29] NguyenQP, DaiZ, LowBKH, JailletP. Optimizing conditional value‐at‐risk of Black‐Box functions. In: RanzatoM (ed.), BeygelzimerA (ed.), DauphinY (ed.), LiangPS (ed.), VaughanJW (ed.), eds. Advances in Neural Information Processing Systems. Vol 34. Curran Associates, Inc.; 2021:4170‐4180. https://proceedings.neurips.cc/paper.files/paper/2021/file/219ece62fae865562d4510ea501cf349‐Paper.pdf
[30] TorossianL, PichenyV, DurrandeN. Bayesian quantile and expectile optimisation. Conference on Uncertainty in Artificial Intelligence. PMLR; 2020:1623‐1633. https://api.semanticscholar.org/CorpusID:210472640
[31] AzzimontiD, BectJ, ChevalierC, GinsbourgerD. Quantifying uncertainties on excursion sets under a gaussian random field prior. SIAM/ASA J Uncertain Quantif. 2016;4(1):850‐874. · Zbl 1352.62144
[32] Da VeigaS, DelbosF. Robust optimization for expensive simulators with surrogate models: Application to well placement for oil recovery. Saf, Reliab, Risk Life‐Cycle Perform Struct Infrastruct. 2013;11:3321‐3328.
[33] BaudouiV. Optimisation Robuste multiobjectifs par modèles de substitution. PhD thesis. Toulouse, ISAE; 2012.
[34] JanusevskisJ, Le RicheR. Simultaneous kriging‐based estimation and optimization of mean response. J Global Optim. 2013;55(2):313‐336. · Zbl 1287.90043
[35] WilliamsBJ, SantnerTJ, NotzWI. Sequential design of computer experiments to minimize integrated response functions. Statist Sin. 2000;10:1133‐1152. · Zbl 0961.62069
[36] CoelhoRF. Probabilistic dominance in multiobjective reliability‐based optimization: Theory and implementation. IEEE Trans Evolut Comput. 2014;19(2):214‐224.
[37] KhosraviF, BorstM, TeichJ. Probabilistic dominance in robust multi‐objective optimization. In: 2018 IEEE Congress on Evolutionary Computation (CEC). IEEE; 2018:1‐6.
[38] MossHB, OberSW, PichenyV. Inducing point allocation for sparse Gaussian processes in high‐throughput Bayesian optimisation. International Conference on Artificial Intelligence and Statistics. PMLR; 2023:5213‐5230.
[39] TitsiasMK. Variational learning of inducing variables in sparse Gaussian processes. International Conference on Artificial Intelligence and Statistics. PMLR; 2009:567‐574. https://api.semanticscholar.org/CorpusID:7811257
[40] WilsonAG, NickischH. Kernel interpolation for scalable structured Gaussian processes (KISS‐GP); 2015.
[41] GPy. GPy: A Gaussian process framework in Python; since 2012. http://github.com/SheffieldML/GPy
[42] RasmussenCE, WilliamsCKI. Gaussian Processes for Machine Learning. Adaptive Computation and Machine Learning. MIT Press; 2006.
[43] PowellCE et al. Generating realisations of stationary Gaussian random fields by circulant embedding. Matrix. 2014;2(2):1.
[44] PourahmadiM. Covariance estimation: The GLM and regularization perspectives. Statist Sci. 2011;26(3):369‐387. · Zbl 1246.62139
[45] LantuéjoulC, DesassisN. Simulation of a Gaussian random vector: a propagative version of the Gibbs sampler. The 9th International Geostatistics Congress; 2012:174‐181.
[46] PanunzioAM, CottereauR, PuelG. Large scale random fields generation using localized Karhunen-Loève expansion. Adv Model Simul Eng Sci. 2018;5(1):20.
[47] ChevalierC, EmeryX, GinsbourgerD. Fast update of conditional simulation ensembles. Math Geosci. 2015 Oct;47(7):771‐789. · Zbl 1323.86020
[48] HerdinM, CzinkN, OzcelikH, BonekE. Correlation matrix distance, a meaningful measure for evaluation of non‐stationary MIMO channels. In: 2005 IEEE 61st Vehicular Technology Conference. Vol 1. IEEE; 2005:136‐140.
[49] RivierM, CongedoPM. Surrogate‐assisted Bounding‐Box approach for optimization problems with tunable objectives fidelity. J Global Optim. 2019;75(4):1079‐1109. · Zbl 1432.90097
[50] DubuissonMP, JainAK. A modified Hausdorff distance for object matching. Proceedings of 12th International Conference on Pattern Recognition. Vol 1. IEEE; 1994:566‐568.
[51] PiniM, PersicoG, PasqualeD, RebayS. Adjoint method for shape optimization in real‐gas flow applications. ASME J Eng Gas Turb Power. 2015;137(3):1‐13.
[52] VitaleS, AlbringTA, PiniM, GaugerNR, ColonnaP. Fully turbulent discrete adjoint solver for non‐ideal compressible flow applications. J Global Power Propuls Soc. 2017;1:Z1FVOI.
[53] PersicoG, Rodriguez‐FernandezP, RomeiA. High‐fidelity shape‐optimization of non‐conventional turbomachinery by surrogate evolutionary strategies. J Turbomach. 2019;141(8):081010.
[54] PiniM, PersicoG, DossenaV. Robust adjoint‐based shape optimization of supersonic turbomachinery cascades. Turbo Expo: Power for Land, Sea, and Air. Vol 45615. American Society of Mechanical Engineers; 2014:V02BT39A043.
[55] RazaalyN, GoriG, IaccarinoG, CongedoP. Optimization of an ORC supersonic nozzle under epistemic uncertainties due to turbulence models. GPPS. Vol 2019. Global Power and Propulsion Society; 2019.
[56] RazaalyN, PersicoG, CongedoPM. Impact of geometric, operational, and model uncertainties on the non‐ideal flow through a supersonic ORC turbine cascade. Energy. 2019;169:213‐227.
[57] CongedoPM, GeraciG, AbgrallR, PedirodaV, ParussiniL. TSI metamodels‐based multi‐objective robust optimization. Eng Comput (Swansea, Wales). 2013;30(8):1032‐1053.
[58] FarinG. Curves and Surfaces for CAGD: A Practical Guide. 5th ed.Morgan Kaufmann Publishers Inc.; 2002.
[59] PalaciosF, ColonnoMF, AranakeAC, et al. Stanford University Unstructured (SU2): An open‐source integrated computational environment for multi‐physics simulation and design. 51st AIAA Aerospace Sciences Meeting including the New Horizons Forum and Aerospace Exposition; 2013:287.
[60] EconomonTD, MudigereD, BansalG, et al. Performance optimizations for scalable implicit RANS calculations with SU2. Comput Fluids. 2016;129:146‐158. http://www.sciencedirect.com/science/article/pii/S0045793016300214 · Zbl 1390.76425
[61] PiniM, VitaleS, ColonnaP, GoriG, GuardoneA, et al. SU2: the open‐source software for non‐ideal compressible flows. Journal of Physics: Conference Series. Vol 821. IOP Publishing; 2016:012013.
[62] VitaleS, GoriG, PiniM, GuardoneA, EconomonTD, PalaciosF, et al. Extension of the SU2 open source CFD code to the simulation of turbulent flows of fluids modelled with complex thermophysical laws. AIAA Paper 2015‐2760; 2015.
[63] RazaalyN. Rare Event Estimation and Robust Optimization Methods with Applications to ORC Turbine Cascade. PhD thesis. Université Paris‐Saclay; 2019.
[64] GardnerJR, KusnerMJ, XuZE, WeinbergerKQ, CunninghamJP. Bayesian optimization with inequality constraints. International Conference on Machine Learning. PMLR; 2014:937‐945.
[65] EchardB, GaytonN, LemaireM. AK‐MCS: an active learning reliability method combining Kriging and Monte Carlo simulation. Struct Saf. 2011;33(2):145‐154.
[66] RazaalyN, CongedoPM. Novel algorithm using active metamodel learning and importance sampling: application to multiple failure regions of low probability. J Comput Phys. 2018;368:92‐114. · Zbl 1392.74094
[67] WilliamsCK, SeegerM. Using the Nyström method to speed up kernel machines. Advances in Neural Information Processing Systems. MIT Press; 2001:682‐688.
[68] TipireddyR, Barajas‐SolanoDA, TartakovskyAM. Conditional Karhunen‐Loève expansion for uncertainty quantification and active learning in partial differential equation models. arXiv:1904.08069; 2019.