I have a number of events that can happen, say e1, e2, ..., eN. N isn't particularly large. Each event has a probability of "failure": p1, p2, ..., pN. (In practice there will likely be a single overall failure rate that gets converted into a per-event probability weighted by some property of each event -- namely its size -- but that's probably not important here.)

You can assume the probabilities are independent: failing at one point doesn't affect the failure probability at any other point (again, this will be modeled in the overall failure rate).

Upon a "failure", you have to start from the beginning, and incur a cost, c1, c2...cN, for failing on e1, e2...eN etc.

The entire process has a fixed cost for completion, C.

So let's say you have only two events, e1 and e2. If you didn't fail on the first event but did fail on the second, and then passed both on the retry, your cost would be c2 + C.

Or, with three events: if you failed on e3, then on the retry failed on e1, and then passed everything, your cost would be c3 + c1 + C.

And so on...
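To make the process concrete, here's a quick Monte Carlo sketch of it in Python (the specific probabilities and costs are just made-up illustrative numbers):

```python
import random

def simulate_cost(p, c, C, rng=random):
    """One trial of the process: attempt events 1..N in order.
    On a failure at event i, pay c[i] and restart from the beginning.
    On a clean pass through all events, pay the completion cost C."""
    total = 0.0
    while True:
        failed = False
        for p_i, c_i in zip(p, c):
            if rng.random() < p_i:   # event i fails
                total += c_i
                failed = True
                break                # restart from e1
        if not failed:
            return total + C         # completed without failing

# Estimate the expected cost by averaging many trials.
random.seed(0)  # for reproducibility
p = [0.1, 0.2, 0.05]   # illustrative failure probabilities
c = [3.0, 5.0, 2.0]    # illustrative failure costs
C = 10.0               # completion cost
trials = 200_000
est = sum(simulate_cost(p, c, C) for _ in range(trials)) / trials
print(est)
```

This is only a numerical sanity check, of course, not the closed form I'm after, but any candidate formula should agree with it.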

My question is this: what is the closed-form expression for the expected cost of this system, over a large number of trials?

I thought maybe it was E=p1(c1+E)+p2(c2+E)+⋯+pN(cN+E), but that doesn't seem right. Can anyone give me some pointers?
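For what it's worth, rearranging my proposed recursion (without claiming it's the right recursion) gives

E(1 − (p1 + p2 + ⋯ + pN)) = p1c1 + p2c2 + ⋯ + pNcN,

i.e. E = (p1c1 + ⋯ + pNcN) / (1 − (p1 + ⋯ + pN)). One thing that stands out is that the completion cost C never appears, which makes me suspect the recursion is missing a success term.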

(I mentioned craps because it was the closest analogue I could think of. In this setup you keep going until you reach the "end" without failing; in craps it's sort of the reverse: you keep going until you fail. Okay, maybe it's not all that related.)

Thanks!