MARKOV CHAINS AND THEIR THEORY IN DECISION MAKING.


DEPARTMENT OF ELECTRICAL AND COMPUTER ENGINEERING,

AHMADU BELLO UNIVERSITY, ZARIA - NIGERIA.

TOPIC: 1. MARKOV CHAINS AND THEIR THEORY IN DECISION MAKING.

2. INVENTORY THEORY & MANAGEMENT.

COURSE: ENGINEERING MANAGEMENT & DECISION - EEEN509

LECTURER: P.U. OKORIE

NAME: AMINU, USMAN KABIR

REG NO: U10EE1095

DATE: 24TH FEBRUARY, 2015.


ABSTRACT

Many problems of decision making under uncertainty can be formulated as sequential decision problems in which the current state and the choice of action determine the next state. Modern

probability theory studies processes for which the knowledge of previous outcomes influences

predictions for future experiments. In principle, when we observe a sequence of chance

experiments, all of the past outcomes could influence our predictions for the next experiment.

Decision making is one of the most important activities in Engineering Management. The success

of a project, business or school is critically linked to effective decisions. The Markov chain is introduced in order to give a mathematical formulation of the decision-making process in qualitative and quantitative analysis. In this paper, I describe and analyze the Markov chain and

its theory in decision making.



INTRODUCTION

In 1907, A. A. Markov began the study of an important new type of chance process. In this process,

the outcome of a given experiment can affect the outcome of the next experiment. This type of

process is called a Markov chain. A general model of a system which moves from one state to another is described and applied to solve sophisticated problems. It is shown that such systems tend to a

steady-state eventually.

The basis of a Markov model is the assumption that the future is independent of the past, given the

present. This arises from the study of Markov chains – sequences of random variables in which the

future variable is determined by the present variable but is independent of the way in which the

present state arose from its predecessors. Markov analysis looks at a sequence of events and

analyses the tendency of one event to be followed by another. Using this analysis, it is possible to

generate a new sequence of random but related events, which appear similar to the original.

A Markov chain may be described as homogeneous or non-homogeneous. A homogeneous chain is characterised by constant transition rates (or probabilities) between states. A non-homogeneous chain is characterised by transition rates between the states that are functions of a global clock, for example, elapsed mission time. In Reliability and Maintainability analysis, a Markov model may be used where events, such as the failure or repair of an item, can occur at any point in time. The model

evaluates the probability of moving from a known state to the next logical state, i.e. from

everything working to the first item failed, from the first item failed to the second item failed and so

on until the system has reached the final or totally failed state.


Markov analysis computes the rates at which transitions occur between system states from such

parameters as the element failure rates and/or repair rates. This is then used to compute system

parameters such as MTBF (Mean Time Between Failures), reliability, availability, etc.
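For illustration, the following Python sketch sets up the simplest possible such model, a single repairable item with two states (working and failed), and computes its steady-state availability and MTBF. The failure and repair rates are invented for the example, and NumPy is assumed to be available.

import numpy as np

# Hypothetical two-state availability model (illustration only).
# State 0 = working, state 1 = failed.
failure_rate = 1e-4   # assumed failures per hour (lambda)
repair_rate = 1e-1    # assumed repairs per hour (mu)

# Generator (transition-rate) matrix of the continuous-time chain:
# rows sum to zero; off-diagonal entries are the transition rates.
Q = np.array([[-failure_rate, failure_rate],
              [repair_rate, -repair_rate]])

# Steady-state probabilities pi satisfy pi Q = 0, with pi summing to 1.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0]          # long-run fraction of time the item is working
mtbf = 1.0 / failure_rate     # mean time between failures, in hours
print(f"Steady-state availability: {availability:.6f}")
print(f"Closed form mu/(lambda+mu): {repair_rate/(failure_rate+repair_rate):.6f}")
print(f"MTBF: {mtbf:.0f} hours")

With the assumed rates, the computed availability agrees with the standard closed-form result mu/(lambda + mu) of about 0.999 for a single repairable item.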

TRANSITION MATRIX

Consider a Markov chain for the weather with three states, rainy, nice and snowy, and transition matrix P. The entries in the first row of the matrix P represent the probabilities for the various kinds of weather following a rainy day. Similarly, the entries in the second and third rows represent the probabilities for the various kinds of weather following nice and snowy days, respectively. Such a square array is called the matrix of transition probabilities, or the transition matrix.

We consider the question of determining the probability that, given the chain is in state i today, it will be in state j two days from now. We denote this probability by P(2)ij. Consider, for example, the probability of going from state 1 (rain) to state 3 (snow) in two steps. One way this can happen is for the chain to stay in state 1 tomorrow and move to state 3 the day after; using the transition matrix P, the probability of this event is the product P11P13. The other two events (passing through state 2 or state 3 tomorrow) also have probabilities that can be written as products of entries of P. Thus, we have

P(2)13 = P11P13 + P12P23 + P13P33.

This equation should remind the reader of a dot product of two vectors; we are dotting the first row of P with the third column of P. This is just what is done in obtaining the (1, 3)-entry of the product of P with itself. In general, if a Markov chain has r states, then

P(2)ij = Pi1P1j + Pi2P2j + ··· + PirPrj,

that is, P(2)ij is the (i, j)-entry of the matrix P2.

Example 1

In ABU Zaria during the early 90's, students were admitted at various campuses, and Samaru, Kongo and Shika admitted only male students. Assume that, at that time, 80 percent of the sons of Samaru graduates went to Samaru and the rest went to Shika; 40 percent of the sons of Shika graduates went to Shika, and the rest split evenly between Samaru and Kongo;


and of the sons of Kongo Graduates, 70 percent went to Kongo, 20 percent to Samaru, and 10

percent to Shika.

We form a Markov chain with transition matrix P, whose rows and columns are labelled by the states. Let Samaru, Kongo and Shika be denoted by S, K and X. From the percentages above,

P =
        S      K      X
  S    0.8    0.0    0.2
  K    0.2    0.7    0.1
  X    0.3    0.3    0.4

If we assume instead that the son of a Samaru graduate always went to Samaru, the transition matrix is now

P =
        S      K      X
  S    1.0    0.0    0.0
  K    0.2    0.7    0.1
  X    0.3    0.3    0.4

MARKOV CHAIN THEOREMS

THEOREM 1

Let P be the transition matrix of a Markov chain. The ij-th entry P(n)ij of the matrix Pn gives the probability that the Markov chain, starting in state si, will be in state sj after n steps.

From Example 1 above, using the first transition matrix,

P =
        S      K      X
  S    0.8    0.0    0.2
  K    0.2    0.7    0.1
  X    0.3    0.3    0.4


To calculate P2, we square the matrix P, and we obtain

P2 =
        S       K       X
  S    0.70    0.06    0.24
  K    0.33    0.52    0.15
  X    0.42    0.33    0.25

THEOREM 2

Let P be the transition matrix of a Markov chain, and let u be the probability vector which represents the starting distribution. Then the probability that the chain is in state si after n steps is the i-th entry in the vector

u(n) = uPn.

For example, in Example 1 above, if the initial probability vector u equals (1/3, 1/3, 1/3), then we can calculate the distribution of the states after two steps using Theorem 2 and our previous calculation of P2. We obtain

u(2) = uP2 = (1/3, 1/3, 1/3) P2 = (0.4833, 0.3033, 0.2133),

so the probabilities of being in states S, K and X after two steps are 0.4833, 0.3033 and 0.2133 respectively.

We note that if we want to examine the behaviour of the chain under the assumption that it starts in a certain state si, we simply choose u to be the probability vector with i-th entry equal to 1 and all other entries equal to 0.
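As a quick numerical check of Theorems 1 and 2 applied to Example 1, the short Python sketch below (assuming NumPy is available) squares the transition matrix and propagates the uniform starting distribution; the matrix entries are the ones derived above from the wording of Example 1.

import numpy as np

# Transition matrix from Example 1 (states in the order S, K, X).
P = np.array([[0.8, 0.0, 0.2],
              [0.2, 0.7, 0.1],
              [0.3, 0.3, 0.4]])

# Theorem 1: the ij-th entry of P^n is the n-step transition probability.
P2 = np.linalg.matrix_power(P, 2)
print("P^2 =")
print(P2.round(4))

# Theorem 2: with starting distribution u, the distribution after n steps is u P^n.
u = np.array([1/3, 1/3, 1/3])
u2 = u @ P2
print("u^(2) =", u2.round(4))   # approximately (0.4833, 0.3033, 0.2133)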

THEOREM 3 (Probability of Absorption)

In an absorbing Markov chain, the probability that the process will be absorbed is 1 (i.e., Qn → 0 as

n → ∞).


ABSORBING MARKOV CHAINS

The subject of Markov chains is best studied by considering special types of Markov chains. The

first type that we shall study is called an absorbing Markov chain.

Definition 1

A state si of a Markov chain is called absorbing if it is impossible to leave it (i.e., pii = 1). A

Markov chain is absorbing if it has at least one absorbing state, and if from every state it is possible

to go to an absorbing state (not necessarily in one step).

Definition 2

In an absorbing Markov chain, a state which is not absorbing is called transient.

THEOREM 4 (The Fundamental Matrix)

Write the transition matrix P of an absorbing Markov chain in canonical form, so that Q is the sub-matrix of transition probabilities among the transient states and R is the sub-matrix of transition probabilities from transient states to absorbing states. Then the matrix I − Q has an inverse N, and N = I + Q + Q2 + ···. The ij-entry nij of the matrix N is the expected number of times the chain is in state sj, given that it starts in state si. The initial state is counted if i = j.

Definition 3

For an absorbing Markov chain P, the matrix N = (I − Q)−1 is called the fundamental matrix for P.

The entry nij of N gives the expected number of times that the process is in the transient state sj if it

is started in the transient state si.

THEOREM 5

Let ti be the expected number of steps before the chain is absorbed, given that the chain starts in state si, and let t be the column vector whose i-th entry is ti. Then

t = Nc,


where c is a column vector all of whose entries are 1.

If we add all the entries in the i-th row of N, we will have the expected number of times in any of the transient states for a given starting state si, that is, the expected time required before being absorbed. Thus, ti is the sum of the entries in the i-th row of N. If we write this statement in matrix form, we obtain the theorem.

THEOREM 6 (Absorption Probability)

Let bij be the probability that an absorbing chain will be absorbed in the absorbing state sj if it starts

in the transient state si. Let B be the matrix with entries bij. Then B is a t-by-r matrix, where t is the number of transient states and r is the number of absorbing states, and

B = NR ,

where N is the fundamental matrix and R is as in the canonical form.
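To make Theorems 4, 5 and 6 concrete, the Python sketch below works them out for a small hypothetical absorbing chain, a gambler's-ruin walk on the states 0 to 4 in which 0 and 4 are absorbing; this chain is not taken from the text and is used only to illustrate the computations.

import numpy as np

# Hypothetical absorbing chain: gambler's ruin on states 0..4.
# States 0 and 4 are absorbing; from 1, 2, 3 the chain moves up or down with probability 1/2.
transient = [1, 2, 3]
absorbing = [0, 4]

P = np.zeros((5, 5))
P[0, 0] = P[4, 4] = 1.0
for s in transient:
    P[s, s - 1] = 0.5
    P[s, s + 1] = 0.5

Q = P[np.ix_(transient, transient)]   # transitions among transient states
R = P[np.ix_(transient, absorbing)]   # transitions from transient to absorbing states

N = np.linalg.inv(np.eye(len(transient)) - Q)   # fundamental matrix (Theorem 4)
t = N @ np.ones(len(transient))                 # expected steps to absorption (Theorem 5)
B = N @ R                                       # absorption probabilities (Theorem 6)

print("N =")
print(N)
print("t =", t)    # (3, 4, 3): expected steps starting from states 1, 2, 3
print("B =")
print(B)           # each row gives P(absorbed at 0) and P(absorbed at 4)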


Example 2

If you have lived in Zaria for a while, you must have realized that the weather is a main concern of

the population's economic, academic and social activities. An unofficial study of the weather in the

city in early spring yields the following observations:

1. It is almost impossible to have two nice days in a row.

2. If we have a nice day, we are equally likely to have a sunny or a rainy day the next day.

3. If we have a sunny or a rainy day, there is an even chance of having the same kind of day the next day.

4. If there is a change from a sunny or a rainy day, only half of the time is this change to a nice day.

a. Write the transition matrix to model this system.

b. If it is nice today, what is the probability of being nice after one week?

c. Find the long-term behaviour of the weather.

Solution

1) Since the weather tomorrow depends only on today, this is a Markov chain process.

The transition matrix of this system is

T =
        N      R      S
  N    0.0    0.25   0.25
  R    0.5    0.50   0.25
  S    0.5    0.25   0.50

where the letters N, R, S represent a Nice, a Rainy and a Sunny day respectively, and the entry in row i and column j is the probability that tomorrow is of kind i given that today is of kind j (so each column sums to 1).

2) If it is nice today, then the initial state-vector is x0 = (1, 0, 0), written as a column vector. After seven days (one week), the state-vector would be

x7 = T7 x0 ≈ (0.2, 0.4, 0.4).


So, there is about a 20% chance of a nice day one week from now.

3) Notice first that we are dealing with a regular Markov chain, since the transition matrix is regular, so we are sure that the steady-state vector exists. To find it we solve the homogeneous system (T − I)X = 0, which has the following coefficient matrix:

  -1.0    0.25    0.25
   0.5   -0.50    0.25
   0.5    0.25   -0.50

Reducing to reduced echelon form gives

   1    0   -0.5
   0    1   -1.0
   0    0    0.0

The general solution of this system is X = (0.5t, t, t), where t is a free parameter.

So what solution do we choose? Remember that a steady-state vector is in particular a probability

vector; that is, the sum of its components is 1: 0.5t + t + t = 1 gives t = 0.4. Thus, the steady-state vector is (0.2, 0.4, 0.4).

In the long term, in Zaria, there is a 20% chance of getting a nice day, a 40% chance of having a rainy day and a 40% chance of having a sunny day.
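The whole of Example 2 can be checked numerically with the short Python sketch below (assuming NumPy), using the column-stochastic matrix T reconstructed above from the four observations.

import numpy as np

# Column-stochastic transition matrix for Example 2 (columns = today, rows = tomorrow),
# with states ordered N (nice), R (rainy), S (sunny).
T = np.array([[0.0, 0.25, 0.25],
              [0.5, 0.50, 0.25],
              [0.5, 0.25, 0.50]])

# (b) Start from a nice day and step the state-vector forward seven days.
x = np.array([1.0, 0.0, 0.0])
for _ in range(7):
    x = T @ x
print("After one week:", x.round(4))   # roughly (0.2, 0.4, 0.4)

# (c) Steady-state vector: solve (T - I) x = 0 together with the condition sum(x) = 1.
A = np.vstack([T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
steady, *_ = np.linalg.lstsq(A, b, rcond=None)
print("Steady state:", steady.round(4))   # (0.2, 0.4, 0.4)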


ADVANTAGES OF MARKOV CHAIN

1. Simplistic Modelling Approach: The models are simple to generate although they do require a

more complicated mathematical approach.

2. Redundancy Management Techniques: System reconfiguration required by failures is easily

incorporated in the model.

3. Coverage: Covered (detected and isolated) and uncovered (undetected) failures of components

are mutually exclusive events, not easily modelled using classical techniques but readily handled by

Markov mathematics.

4. Complex maintenance options can readily be modelled.

5. Complex Systems: Many simplifying techniques exist which allow the modelling of complex

systems.

6. Sequenced Events: Markov modelling easily handles the computation of the probability of an

event resulting from a sequence of sub-events. This type of problem does not lend itself well to

classical techniques.

LIMITATIONS OF MARKOV CHAIN IN DECISION MAKING

1. Vast increase in number of states as the size of the system increases. The resulting diagrams for

large systems are generally very extensive and complex, difficult to construct and computationally intensive.


2. Markov modelling of redundant repairable systems with automatic fault detection and one repair

crew is flawed. This is because although random failure is a Markov process, repair of multiple

failures is not a Markov Process. The mathematical discrepancy may be overcome by using a

dedicated repair crew per equipment, but this does not normally correspond to real life support

strategies.

4. Can only deal with constant failure rates and constant repair rates - the latter being unrealistic in

real, operational systems for many reasons including, for example, changing physical conditions

and variations in maintenance skills. However, if the MTTR is very much shorter than the MTTF,

then this shortcoming rarely introduces serious inaccuracy in the final computed system parameters.

5. Future states of the system are independent of all past states except the immediately preceding

one, which implies that a repair returns the system to an “as new” condition.

APPLICATIONS OF MARKOV CHAIN

Physics: Markovian systems appear extensively in thermodynamics and statistical mechanics,

whenever probabilities are used to represent unknown or unmodelled details of the system.

Chemistry: Chemistry is often a place where Markov chains and continuous-time Markov

processes are especially useful because these simple physical systems tend to satisfy the Markov

property quite well.

Speech Recognition: Hidden Markov Models are the basis for most modern automatic speech

recognition systems.

Information sciences: Markov chains are used throughout information processing. They also

allow effective state estimation and pattern recognition.


Internet applications: The PageRank of a webpage as used by Google is defined by a Markov

chain. Markov models have also been used to analyze web navigation behavior of users.

Statistics: Markov chain methods have also become very important for generating sequences of

random numbers to accurately reflect very complicated desired probability distributions, via a

process called Markov chain Monte Carlo (MCMC).

Economics and finance: Markov chains are used in finance and economics to model a variety of

different phenomena, including asset prices and market crashes. Dynamic macroeconomics heavily

uses Markov chains. An example is using Markov chains to exogenously model prices of equity

(stock) in a general equilibrium setting.

Mathematical biology: Markov chains also have many applications in biological modelling,

particularly population processes, which are useful in modelling processes that are (at least)

analogous to biological populations.

Genetics: Markov chains have been used in population genetics in order to describe the change in

gene frequencies in small populations affected by genetic drift.

Games: Markov chains can be used to model many games of chance. The children's game Snakes and Ladders, for example, is represented exactly by a Markov chain. At each turn, the player starts in

a given state (on a given square) and from there has fixed odds of moving to certain other states

(squares).


INVENTORY THEORY AND MANAGEMENT

INTRODUCTION

The key decision in manufacturing, retail and some service industry businesses is how much

inventory to keep on hand. Inventory is usually a business’s largest asset. The instant inventory

levels are established, they become an important input to the budgeting system. Inventory decisions

involve a delicate balance between three classes of costs: ordering costs, holding costs, and shortage

costs.

INVENTORY

Inventory theory is the overseeing and controlling of the ordering, storage and use of components

that a company will use in the production of the items it will sell as well as the overseeing and

controlling of quantities of finished products for sale. A business's inventory is one of its major

assets and represents an investment that is tied up until the item is sold or used in the production of

an item that is sold. It also costs money to store, track and insure inventory. Inventories that are

mismanaged can create significant financial problems for a business, whether the mismanagement

results in an inventory glut or an inventory shortage.

Conceptually, inventory refers to objects that are sold, distributed, or otherwise consumed by a company. Inventory is also defined as the goods and materials that a business holds for the ultimate purpose of resale (or repair). Inventory management is a science primarily about specifying the shape and placement of stocked goods. Inventory is also known as “an itemized list of goods or

valuables, with their estimated worth; specifically, the annual account of stock taken in any

business”


INVENTORY MANAGEMENT

Inventory management has to do with keeping accurate records of finished goods that are ready for

shipment. This often means posting the production of newly completed goods to the inventory

totals as well as subtracting the most recent shipments of finished goods to buyers. When the

company has a return policy in place, there is usually a sub-category contained in the finished

goods inventory to account for any returned goods that are reclassified as refurbished or second

grade quality. Accurately maintaining figures on the finished goods inventory makes it possible to

quickly convey information to sales personnel as to what is available and ready for shipment at any

given time.

Inventory management involves a retailer seeking to acquire and maintain a proper merchandise

assortment while ordering, shipping, handling, and related costs are kept in check. It also involves

systems and processes that identify inventory requirements, set targets, provide replenishment

techniques, report actual and projected inventory status and handle all functions related to the

tracking and management of material.

Balancing the various tasks of inventory management means paying attention to three key aspects

of any inventory. The first aspect has to do with time. In terms of materials acquired for inclusion in

the total inventory, this means understanding how long it takes for a supplier to process an order

and execute a delivery. Inventory management also demands a solid understanding of how long it will take for those materials to transfer out of the inventory. Knowing these two

important lead times makes it possible to know when to place an order and how many units must be

ordered to keep production running smoothly.


TYPES OF INVENTORY

There are several types of inventory. Raw materials, purchased parts and supplies, work-in-process, and component parts are inventories to many businesses. Businesses also need tools, machinery and equipment as part of their capital inventory. In most businesses, however, finished goods are the main inventory, especially for small businesses.

Inventory control involves the procurement, care and disposition of materials. There are three kinds

of inventory that are of concern to managers:

• Raw materials

• In-process or semi-finished goods

• Finished goods.

If a manager effectively controls these three types of inventory, capital that may be tied up in unnecessary inventory can be released, production control can be improved, and the business can be protected against obsolescence, deterioration and/or theft.

BENEFITS OF INVENTORY THEORY AND MANAGEMENT

1.) Inventory management can help businesses be more profitable by lowering their cost of goods sold and/or by increasing sales.

Consider a typical business, Daula Restaurant, Samaru-Zaria, with the following income statement:

Sales N 2,000,000

Cost of Goods Sold 1,100,000

Gross Profits 900,000

Gen. Administrative Expenses 402,000

Marketing Expenses 350,000

Net Income before taxes N 148,000

==========


Comment from above: Not bad -- return on sales is over 7%.

Now, suppose that through application of sound inventory management principles, Daula

Restaurant was able to reduce the cost of goods sold by 3%. And because there is less inventory, let's say that lower carrying costs (warehouse storage charges, insurance, finance charges, etc.) reduce the general administrative expenses by 2%. Those minimal cost reductions result in a significant increase in net income:

Sales N 2,000,000

Cost of Goods Sold 1,067,000

Gross Profits 933,000

Gen. Administrative Expenses 394,000

Marketing Expenses 350,000

Net Income before taxes N 189,000

===========

Small cost reductions due to the application of sound inventory management principles resulted in a very significant increase (28%) in net income!

A lower cost of goods sold is achieved by keeping the inventory smaller so that it turns over more often, while making sure that stocks are large enough results in increased sales, because products are available when customers call for them. Inventory management is the balancing of these two opposing factors for optimum profitability (a short calculation sketch of the Daula Restaurant figures is given at the end of this list of benefits).

3.) Improve Customer Service

4.) Reduce Inventory Investment


5.) Increase Productivity

6.) Prevent Poor Inventory Record Accuracy

Inventory record errors are costly. No computer system, be it old or new, will work properly if

the transactions are not entered correctly. The costs of poor inventory record accuracy are not

always apparent to management. Consider the following results, all of which increase production

costs and reduce profits:

• Unanticipated stock-outs

• Decreased production efficiency

• Higher investment in safety stocks

• Requirement for staging of items to determine availability or shortages

• Invalid data for inventory replenishment system

• More obsolete and excess inventory

Some of these costs can be quantified. Others are intangible, but nevertheless do exist and can be

substantial. It is important to have accurate inventory records. Most experts agree that

this accuracy must be at least 95% and even higher for critical or high unit value items. The key to

accurate records is the implementation of a sound cycle counting system.
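Referring back to the Daula Restaurant figures under benefit 1 above, the following minimal Python sketch reproduces that arithmetic; the percentages and amounts are exactly those used in the example.

# Daula Restaurant example: effect of small cost reductions on net income.
sales = 2_000_000
cogs = 1_100_000
admin = 402_000
marketing = 350_000

net_before = sales - cogs - admin - marketing             # N 148,000

# Apply a 3% reduction in cost of goods sold and a 2% reduction
# in general administrative expenses (lower carrying costs).
cogs_after = cogs * 0.97                                  # N 1,067,000
admin_after = admin * 0.98                                # approx N 394,000
net_after = sales - cogs_after - admin_after - marketing  # approx N 189,000

print(f"Net income before: N {net_before:,.0f}")
print(f"Net income after:  N {net_after:,.0f}")
print(f"Increase: {(net_after - net_before) / net_before:.0%}")   # about 28%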


INVENTORY MANAGEMENT SYSTEMS

These are tools and/or systems used for effective inventory management strategies. They include:

Inventory Tracking Software

The software you use will determine how you track your inventory. Support for real-time vs. batch

updates and the depth of reporting will determine which application is most appropriate for your

business.

Mobile Computer

Inventory tracking requires users to be on the move throughout your facility making updates and

changes. Depending on your environment and need for wireless communication, a range of devices can serve your process.

Wireless Infrastructure

Many businesses need inventory updates to occur in real-time and a wireless network is a required

tool to do so. No matter the size of your facility, a network can be scaled to accommodate it.

Barcode Printer

In order to track items quickly and easily, they need to be labeled with a barcode. Printing can be

done at fixed units for high volume printing or on the move with mobile printers.

Barcode Scanner

A barcode scanner is used to read every barcode type in any environment.


METHODS TO SUPERVISE INVENTORY

The success of a business depends on the owner's ability to maintain adequate quantities of the items sold. Records provided by an inventory control system should call attention to the need to reorder when necessary, or to eliminate “dead wood” inventory when called for.

Inventories are controlled and supervised by three (3) methods:

(a) Perpetual Inventory Control-

The perpetual method is the most frequently used method. It is more costly than the other two but it

is an efficient way of keeping count. In this system, complete data records are kept on each item of

merchandise, and additions or subtractions are made with each transaction. There is an inventory balance, plus receipts, minus actual sales, to reflect the quantity on hand.

(b) Actual Piece Counting-

This is another method used to control and supervise inventory: the inventory is actually counted item by item. This is an exhausting task and not many companies or businesses do it. Salespeople are usually involved in this process, and there is a large margin of error to be considered as the salespeople go through the monotonous and tiring task of counting everything.

(c) “Looking It Over”- The third method is “Looking over” the inventory. It is the easiest and

cheapest way of controlling and supervising inventory, but there are bound to be errors. With this

method, it is hard to pinpoint the inventory levels, the items that need to be ordered, and the items

that the store is overstocked with. Almost all financial statements that include inventory figures

based on this method cannot be completed accurately.


VISIBLE COSTS OF INVENTORY

As mentioned earlier, inventory decisions involve a delicate balance between three classes of cost.

These costs are:

Ordering Cost- Cost of replenishing Inventory

Carrying Cost- Cost of holding an item in inventory

Shortage Cost- Temporary or permanent loss of sales when demand cannot be met

HIDDEN COSTS OF INVENTORY

Having inventory constantly at hand is good but sometimes there are hidden costs that would prove

to be a menace for businesses. These hidden costs include:

• Longer lead times

• Reduced responsiveness

• Underlying problems being hidden rather than exposed and solved

• Quality problems not being identified immediately

• No incentive for improvement of the process


REFERENCES

Hilton, Ronald W., Managerial Accounting, McGraw-Hill, Inc., 1994, pp. 13-14, 218, 407.

Galloway, R., Principles of Operations Management, Routledge.

Hill, T., Production/Operations Management, Prentice Hall, 1991.

Saunders, Strategic Purchasing and Supply Management, Pitman.

Slack, N., Chambers, S., Harland, C., and Johnston, R., Operations Management, Pitman, 1995.

Behnam (2013), Operations and Production Systems with Multiple Objectives, John Wiley & Sons.

Berger, J. O., Statistical Decision Theory: Foundations, Concepts and Methods, Springer-Verlag, New York, 1980.

Kemeny, J. and Snell, J., Finite Markov Chains, Springer-Verlag, New York, 1976.

Voskoglou, M. G. and Perdikaris, S. C., A Markov Chain Model in Problem-Solving, Int. J. Math. Educ. Sci. Technol., 22, 909-914, 1991.

Voskoglou, M. G., An Application of Markov Chains to the Process of Modelling, Int. J. Math. Educ. Sci. Technol., 25, 475-480, 1994.