I have a day of data, broken down into hourly segments. For each hour, I have a measure of cost, and a measure of conversions. This in turn gives me the cost-per-conversion by hour.

I can group the hours of the day into at most six groups and apply a % adjustment to each group, with the aim of driving the most conversions from a fixed budget. There is also a limit on the available volume in any given hour, which I factor into the group-specific calculations.

I take the day's overall cost-per-conversion as the baseline requiring a 0% adjustment (since it's the average).

I figure I need some sort of iterative comparison of overall conversion volume, making small amendments to the groupings and percentages, to iterate towards the grouping that yields the maximum.
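To make the "small amendments" idea concrete, here is a minimal hill-climbing sketch. All the numbers and the response model are assumptions for illustration only: I assume six "hours" with hypothetical base cost, base conversions, and an hourly volume cap, that conversions scale linearly with (1 + adjustment) up to the cap, and that spend scales linearly with the adjustment under a fixed budget. It perturbs one hour's % at a time and keeps any change that does not reduce total conversions.

```python
import random

# Hypothetical toy data (assumptions, not from my actual data set).
BASE_COST = [10.0, 12.0, 8.0, 20.0, 15.0, 9.0]   # spend per hour at 0% adjustment
BASE_CONV = [2.0, 3.0, 1.0, 5.0, 4.0, 2.0]       # conversions per hour at 0%
VOLUME_CAP = [1.5, 2.0, 2.0, 8.0, 6.0, 3.0]      # max conversions available per hour
BUDGET = sum(BASE_COST)                          # fixed daily budget

def evaluate(adjustments):
    """Total conversions for per-hour % adjustments, under the toy model:
    spend and conversions scale with (1 + adj), conversions capped by volume."""
    spend = [c * (1 + a) for c, a in zip(BASE_COST, adjustments)]
    if sum(spend) > BUDGET:
        return 0.0  # infeasible: over budget
    return sum(min(cap, v * (1 + a))
               for v, cap, a in zip(BASE_CONV, VOLUME_CAP, adjustments))

def hill_climb(n_hours=6, steps=2000, seed=0):
    rng = random.Random(seed)
    adj = [0.0] * n_hours            # start at the 0% baseline
    best = evaluate(adj)
    for _ in range(steps):
        h = rng.randrange(n_hours)
        cand = adj[:]
        cand[h] = max(-0.9, cand[h] + rng.choice([-0.05, 0.05]))
        score = evaluate(cand)
        if score >= best:            # accept improvements and sideways moves
            adj, best = cand, score
    return adj, best

adj, best = hill_climb()
print(adj, best)
```

Sideways moves (`score >= best`) matter here: lowering the % on a volume-capped hour loses nothing, and the freed budget can then be pushed into an uncapped hour on a later step.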

Would it be accurate to take all possible groupings, compute each group's % adjustment from the data within the group versus the daily average, and then simply compare the conversion volume of each partition to determine the optimal one?
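A sketch of that exhaustive comparison, on a toy 6-hour day (all figures hypothetical). Each assignment of hours to at most three group labels is enumerated, each group's % is set from its cost-per-conversion relative to the daily average, and total conversions are compared across assignments. For brevity I assume conversions scale linearly with (1 + adjustment) and omit the budget and volume caps; note that for a real 24-hour day with six groups this enumeration explodes combinatorially, which is partly why I ask about alternatives.

```python
from itertools import product

# Hypothetical toy data: cost and conversions for six "hours".
COST = [10.0, 12.0, 8.0, 20.0, 15.0, 9.0]
CONV = [2.0, 3.0, 1.0, 5.0, 4.0, 2.0]
DAILY_CPA = sum(COST) / sum(CONV)   # day's average cost-per-conversion
MAX_GROUPS = 3

def group_adjustment(hours):
    """% adjustment for a group: its cost-per-conversion vs the daily average.
    A group that converts more cheaply than average gets a positive adjustment."""
    cpa = sum(COST[h] for h in hours) / sum(CONV[h] for h in hours)
    return DAILY_CPA / cpa - 1.0

def evaluate(assignment):
    """Toy model (an assumption): conversions scale linearly with (1 + adj)."""
    groups = {}
    for h, g in enumerate(assignment):
        groups.setdefault(g, []).append(h)
    total = 0.0
    for hours in groups.values():
        adj = group_adjustment(hours)
        total += sum(CONV[h] * (1 + adj) for h in hours)
    return total

# Enumerate every assignment of 6 hours to up to MAX_GROUPS labels.
# (Relabelings of the same partition are enumerated redundantly; harmless here.)
best = max(product(range(MAX_GROUPS), repeat=len(COST)), key=evaluate)
print(best, evaluate(best))
```

Putting the whole day in one group reproduces the 0% baseline, so the enumerated maximum is never worse than it, and here it is strictly better because the expensive hour can be split out on its own.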

Is there an easier alternative?

My problem is determining which methods suit the problem. I certainly welcome mathematical assistance, and I'm happy to work through it myself, but if it's possible to get your thoughts on relevant methods it would be much appreciated.

Attached is an image of sample data just for clarification.