A dictionary definition of a Markov chain is:
‘A sequence of events, the probability for each of which is dependent on the event immediately preceding it.’
Markov models are comprehensive representations of the possible chains of events, i.e., transitions, within a system; in reliability and availability analysis, these correspond to sequences of failures and repairs.
A Markov model gives the probability of being in a given state at a given point in time, the amount of time a system is expected to spend in a given state, and the expected number of transitions between states, for example the expected number of failures and repairs.
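As a minimal sketch of these quantities, consider the simplest possible model: a single repairable component with two states, Working and Failed. Assuming a constant failure rate lam and repair rate mu (the numeric values below are purely illustrative), the probability of being in the Working state at time t has a well-known closed form:

```python
import numpy as np

# Assumed, purely illustrative rates for one repairable component:
lam = 1e-3  # failure rate (per hour), Working -> Failed
mu = 1e-1   # repair rate (per hour), Failed -> Working

def availability(t):
    """P(Working at time t), starting in Working at t = 0.

    Closed-form solution of the two-state Markov model:
    A(t) = mu/(lam+mu) + lam/(lam+mu) * exp(-(lam+mu)*t)
    """
    s = lam + mu
    return mu / s + (lam / s) * np.exp(-s * t)

print(availability(0.0))   # 1.0: the component starts in Working
print(availability(24.0))  # transient availability after one day
print(mu / (lam + mu))     # limiting availability: long-run fraction
                           # of time spent in the Working state
```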
Markov models allow for a detailed representation of failure and repair processes, particularly when dependencies are involved.
Unlike simulation-based analyses, Markov analysis handles rare events well, and therefore allows such events to be analyzed within a reasonable amount of time.
Markov models can also be applied in any situation where distinct states and transitions between them are known.
Sometimes these states are clear opposites, such as "Working" versus "Failed" or "Good" versus "Bad", but more often there are many intermediate states, which Markov models can also account for.
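As a sketch of such an intermediate state, the following assumes a hypothetical three-state model (Good, Degraded, Failed) with illustrative transition rates, builds the corresponding generator matrix, and solves for the long-run probability of each state:

```python
import numpy as np

# Hypothetical three-state model: 0 = Good, 1 = Degraded, 2 = Failed.
# All rates (per hour) are assumed for illustration only.
lam_gd = 2e-3  # Good -> Degraded (partial failure)
lam_df = 5e-3  # Degraded -> Failed (complete failure)
mu_dg = 1e-1   # Degraded -> Good (minor repair)
mu_fg = 2e-2   # Failed -> Good (major repair)

# Generator matrix Q: off-diagonal entries are transition rates,
# each diagonal entry makes its row sum to zero.
Q = np.array([
    [-lam_gd, lam_gd, 0.0],
    [mu_dg, -(mu_dg + lam_df), lam_df],
    [mu_fg, 0.0, -mu_fg],
])

# Steady-state probabilities pi satisfy pi @ Q = 0 with sum(pi) = 1;
# stack the normalization row onto the transposed system and solve.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(dict(zip(["Good", "Degraded", "Failed"], pi)))
```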
Markov analysis is a technique used to obtain numerical measures such as the probability of being in a given state and the reliability and availability of a system or part of a system.
Markov analysis is performed when dependencies between the failures of multiple components, or between component failures and failure rates, cannot be easily represented using a combination of fault trees and other techniques.
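A classic dependency of this kind is load sharing: once the first of two redundant components fails, the survivor carries the full load and fails at a higher rate. A fault tree cannot easily express the changed rate, whereas a Markov model states it directly. The sketch below, using assumed rates for a hypothetical pump pair, computes the mean time to failure (MTTF) from the generator restricted to the non-failed states:

```python
import numpy as np

# Assumed, illustrative rates for a hypothetical load-sharing pump pair:
lam = 1e-3       # per-pump failure rate under shared load (per hour)
lam_full = 4e-3  # survivor's elevated failure rate under full load

# Transient (non-failed) states: 0 = both up, 1 = one up.
# "Both failed" is absorbing and is left out of the restricted generator.
Q_T = np.array([
    [-2 * lam, 2 * lam],
    [0.0, -lam_full],
])

# Expected times to absorption tau solve Q_T @ tau = -1 (a vector of ones).
tau = np.linalg.solve(Q_T, -np.ones(2))
print("MTTF with both pumps up:", tau[0])
print("closed-form check:", 1 / (2 * lam) + 1 / lam_full)
```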
There are three steps: defining the system states, defining the transitions between those states, and solving the model for the state probabilities of interest.
Both continuous and discrete transitions can be introduced into the model.
Continuous transitions represent events that can take place at any time within a given interval, whereas discrete transitions take place at specified points in time.
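The sketch below illustrates the distinction with assumed rates: a continuous failure/repair process is captured by a generator matrix and evolved with a matrix exponential, while a hypothetical periodic inspection at t = 24 h is a discrete transition applied as a one-step probability matrix:

```python
import numpy as np
from scipy.linalg import expm

# Continuous transitions: failure and repair can occur at any time, so
# they appear as rates in the generator Q (states: 0 = Working, 1 = Failed).
lam, mu = 1e-3, 1e-1  # assumed failure/repair rates (per hour)
Q = np.array([[-lam, lam],
              [mu, -mu]])
p0 = np.array([1.0, 0.0])   # start in Working
p_t = p0 @ expm(Q * 24.0)   # state probabilities after 24 hours
print("continuous, t = 24 h:", p_t)

# Discrete transition: an event at a specified point in time, e.g. a
# periodic inspection that finds and repairs a latent failure with an
# assumed probability of 0.9, modelled as a one-step probability matrix.
P_inspect = np.array([[1.0, 0.0],
                      [0.9, 0.1]])
p_after = p_t @ P_inspect   # apply the discrete jump at t = 24 h
print("after inspection:", p_after)
```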