# Suppose that students arrive at a lecturer’s office according to a Poisson process with rate λ, and each conversation between a lecturer and a student is exponentially distributed with rate μ.


After each conversation is finished, there is a probability p that the student’s problem is solved and he or she consequently leaves.

With probability 1 − p, however, the student suddenly remembers another question and immediately rejoins the queue to talk to the lecturer again.

(a) Write down an appropriate state space S for a continuous-time Markov chain model of the number of students in the queue, and the transition matrix Q for the continuous-time Markov chain.

(b) Let fj be the probability that the queue ever empties, given that it starts with j students. Write down a set of equations and boundary conditions satisfied by the fj .

(c) Explain your boundary conditions, and state what further condition is needed to single out the solution of these equations that gives the fj.

(d) Solve these equations and give the criterion for this continuous-time Markov chain to be recurrent.
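As a sketch of how such a model is usually set up (assuming a student who rejoins does so instantly, so an unresolved question leaves the queue length unchanged), the queue length is a birth–death chain on S = {0, 1, 2, …} with transition rates

```latex
q_{j,\,j+1} = \lambda \quad (j \ge 0), \qquad
q_{j,\,j-1} = \mu p \quad (j \ge 1), \qquad
q_{jj} = -\lambda - \mu p\,\mathbf{1}\{j \ge 1\},
```

since a conversation ends at rate μ and removes the student only with probability p. Each student's total time with the lecturer is a Geometric(p) number of Exp(μ) conversations, which is again exponentially distributed with rate μp, so the queue behaves like an M/M/1 queue with arrival rate λ and effective service rate μp; such a chain is recurrent exactly when λ ≤ μp.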

**Markov chains**


### 1 Answer

To make headway with a complex probability problem like this stochastic-process Markov chain problem, you must be certain you grasp the **definitions** of the important terms. To build that understanding, we'll go over these definitions here.

**Stochastic Process**

The Poisson Process, named for its developer, French mathematician Siméon Denis Poisson, is a **stochastic process**.

**Stochastic Process:** A process describing how the state of a given set or system of events varies over time (t). Event sets can vary over time, and this variation in time is what constitutes a stochastic process.

"**Stochastic**" is **defined** as describing a system of events that incorporates randomness instead of determined occurrences.

**Poisson Process**

A **Poisson Process** is a counting process used in **queueing theory** to model random (*or stochastic*) arrival events. It counts arrivals to a queueing system, and the probability of an arrival occurring in a given interval depends only on the length of that time (t) interval. Arrivals have no dependence upon, or correlation with, the history of arrivals up to that time: if arrivals depended on the preceding history, they would not be purely random, and the process would lose the independence that defines a Poisson process.

The **Poisson Process** is a continuous-time counting process, written {N(t), t ≥ 0}, where N(t) is the number of arrivals by time t.
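As a minimal sketch (the rate and horizon below are arbitrary illustrative values), we can simulate N(t) by summing exponential inter-arrival times and check that the average count over many runs is close to rate × t, as the Poisson process predicts:

```python
import random

def poisson_count(rate, t, rng):
    """Count arrivals in [0, t] for a Poisson process of the given rate,
    generated by summing i.i.d. exponential inter-arrival times."""
    n, clock = 0, rng.expovariate(rate)
    while clock <= t:
        n += 1
        clock += rng.expovariate(rate)
    return n

rng = random.Random(0)
rate, t, runs = 2.0, 5.0, 20000
mean = sum(poisson_count(rate, t, rng) for _ in range(runs)) / runs
print(round(mean, 1))  # close to rate * t = 10
```

Because the inter-arrival times are memoryless, the count in any interval depends only on that interval's length, which is the independence property described above.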

**Markov Chain**

A **Markov chain**, named for its developer A. A. Markov, is a random (stochastic) process in which the current event **can** affect the outcome of a subsequent event (in contrast with the increments of a *Poisson process*, which **cannot** affect one another). With a Markov chain, however, only the current state matters: given the present state, the earlier history of the process has no further effect on the outcome.

A Markov chain is **described** as a set of states in which a process starts in one of the states and then moves from one state to the next in successions called steps. The set of states is expressed as S = {s₁, s₂, …, sᵣ}. Individual states are expressed as sᵢ, sⱼ, and so on. The probability of moving from state sᵢ to state sⱼ in one step is expressed, then, as pᵢⱼ.

Additionally, the process can remain static, staying in the state it is already in for a step. The probability of staying in state sᵢ rather than moving to a successive state is expressed pᵢᵢ.
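The transition probabilities pᵢⱼ, including the "stay put" probabilities pᵢᵢ on the diagonal, can be collected into a matrix and sampled from directly. A minimal sketch, using a hypothetical 3-state transition matrix (the numbers are illustrative, not from the problem above):

```python
import random

# Hypothetical transition matrix P: row i lists p_ij for j = 0, 1, 2,
# with the diagonal entry p_ii giving the probability of staying put.
P = [
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
]

def step(state, rng):
    """Sample the next state from row `state` of P."""
    return rng.choices(range(len(P)), weights=P[state])[0]

rng = random.Random(1)
state = 0
path = [state]
for _ in range(10):
    state = step(state, rng)  # next state depends only on the current one
    path.append(state)
print(path)
```

Note that `step` looks only at the current state, never at `path`: that is precisely the Markov property described above.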
