Optimal preventive policies for parallel systems using Markov decision process: application to an offshore power plant

Abstract

This work proposes a Markov Decision Process (MDP) model for identifying windows of opportunity to perform preventive maintenance on multi-unit parallel systems subject to varying demand. The main contribution lies in proposing: (i) a reward function that does not depend on maintenance costs, which are typically difficult to assess and classify; and (ii) a new metric for prevention.
By optimizing the capacity utilization rate and the decision flexibility, expressed in terms of standby units, for a set of typical operational scenarios, the optimal opportunities for preventive interventions are identified within their respective prevention ranges for an offshore power plant (case study). The sequential decision problem is solved with the Value Iteration algorithm to obtain optimal long-term policies. The result is a backlog management decision-support solution, built on a low-cost computational model, that provides scenario-dependent preventive policies and promotes the integration of operations with maintenance, while being easy to implement, maintain, and communicate to stakeholders.
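The abstract mentions solving the sequential decision problem with the Value Iteration algorithm. The sketch below is only a minimal, generic illustration of value iteration for a finite MDP; the state space, transition probabilities, and reward function are placeholders and do not reproduce the paper's actual model of capacity utilization and standby units.

```python
import numpy as np

def value_iteration(P, R, gamma=0.95, tol=1e-8):
    """Generic value iteration for a finite MDP.

    P: transition probabilities, shape (A, S, S), where P[a, s, s'] is the
       probability of moving from state s to s' under action a.
    R: immediate rewards, shape (S, A).
    Returns the optimal value function (S,) and a greedy policy (S,).
    """
    n_actions, n_states, _ = P.shape
    V = np.zeros(n_states)
    while True:
        # Q[s, a] = R[s, a] + gamma * sum_{s'} P[a, s, s'] * V[s']
        Q = R + gamma * np.einsum('ast,t->as', P, V).T
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)
        V = V_new

# Tiny hypothetical example: 2 states, 2 actions (e.g., "operate" vs. "maintain").
P = np.array([[[0.9, 0.1], [0.4, 0.6]],    # action 0
              [[0.7, 0.3], [0.9, 0.1]]])   # action 1
R = np.array([[1.0, -0.5],                 # rewards per (state, action)
              [0.2,  0.8]])
V, policy = value_iteration(P, R)
```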

Category

Academic article

Language

English

Author(s)

  • Mario Machado
  • Thiago Lima Silva
  • Eduardo Camponogara
  • Edilson de Arruda
  • Virgílio Ferreira Filho

Affiliation

  • Federal University of Rio de Janeiro
  • Petrobras
  • Norwegian University of Science and Technology
  • SINTEF Industry / Sustainable Energy Technology
  • Federal University of Santa Catarina
  • Universidade Federal do Estado do Rio de Janeiro

Year

2023

Published in

EURO Journal on Decision Processes

ISSN

2193-9438

Publisher

Elsevier

Volume

11