Determining how much data is required by detecting when a graph levels out

#1
I have a bunch of graphs of data that I collected in my last experiment. The graphs are usually shaped like either exponential curves or upside-down exponential curves with noise, and they level out over time. I was wondering how I can determine the amount of data that needs to be collected before the graph levels out. That is, I am looking for the amount of data that should be collected in the future so that I get a good estimate of the final levelled-out value while collecting as little data as possible.

I believe I need to do an analysis that checks whether adding more data results in a significant change in the value; however, I am currently not sure what test I can use to determine this.

Thanks!
 
#2
Can I use regression over time for this problem?

There are only 15 data points over time for each instance. The levelling out appears to occur around 8 data points for most instances.
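
One way to make this concrete: fit a saturating model to growing prefixes of each series and watch when the estimated plateau stops moving. Below is a minimal sketch, assuming the model y = a(1 − e^(−bx)) + c and evenly spaced observations (both are assumptions on my part; the data here is synthetic):

```python
import numpy as np
from scipy.optimize import curve_fit

def plateau_model(x, a, b, c):
    # Saturating exponential: tends to a + c as x grows.
    return a * (1.0 - np.exp(-b * x)) + c

# Synthetic stand-in for one 15-point instance (illustrative only).
rng = np.random.default_rng(0)
x = np.arange(1, 16, dtype=float)
y = plateau_model(x, 5.0, 0.5, 1.0) + rng.normal(0.0, 0.2, size=x.size)

# Refit on growing prefixes; once the plateau estimate a + c stabilizes,
# extra points are no longer buying much.
for n in range(5, 16):
    (a, b, c), _ = curve_fit(plateau_model, x[:n], y[:n],
                             p0=(1.0, 0.1, 0.0), maxfev=10000)
    print(f"n={n:2d}  estimated plateau = {a + c:.3f}")
```

A simple stopping rule would then be: stop collecting once successive plateau estimates change by less than some practically meaningful amount.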
 

hlsmith

Less is more. Stay pure. Stay poor.
#4
In particular, look up the limit of a sequence or the limit of a function, depending on your scenario. It may also be educational to all if you post your results back here.
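
For instance, for a saturating exponential of the kind described in #1 (an assumed model on my part, not confirmed by the OP), the limit is just the asymptote:

$$\lim_{x \to \infty}\left[a\left(1 - e^{-bx}\right) + c\right] = a + c$$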
 
#5
Hi hlsmith,

Thanks for your comment. I looked into convergence and the limit, but I think this still doesn't solve my problem.

I want to figure out the x-value where more data (more x's) no longer significantly changes the value of y.

The limit will just tell me what final value of y is being approached.
 

noetsi

Fortran must die
#7
It has been too many years since I did calculus, but the answer, I think, is the point where the slope of the curve is zero. There is, I believe, software that will calculate this, or you can do it by eyeing the data (although I assume you are not interested in the latter).
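
A rough numerical version of that idea might look like the sketch below, assuming evenly spaced points; the tolerance is an arbitrary illustrative choice:

```python
import numpy as np

def first_flat_index(y, tol=0.05):
    # Estimate slopes by central differences and return the first index
    # from which the absolute slope stays below tol.
    slope = np.gradient(np.asarray(y, dtype=float))
    flat = np.abs(slope) < tol
    for i in range(len(flat)):
        if flat[i:].all():
            return i
    return None

y = [1.0, 2.2, 3.0, 3.6, 3.9, 4.1, 4.15, 4.2, 4.18, 4.21]
print(first_flat_index(y))  # index where the curve has effectively leveled out
```

With noisy data the slope estimates will be noisy too, so in practice the tolerance would need to be set relative to the noise level.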
 

hlsmith

Less is more. Stay pure. Stay poor.
#11
Are your graphs just line-connected scatterplots? I have a couple of ideas, but I don't fully understand your data and procedures. It might also help to just lay out the scenario and possibly post a graph.
 

noetsi

Fortran must die
#12
As I read back through this, I am confused about what you really want to know here. The amount of data you collect is normally tied to issues like statistical power and generalizability. It is not tied to issues such as when a graph will level out, nor will a leveling out (actually a change in the direction of the curve, I think) address power or generalizability. When a graph (a function, I assume) levels out would be determined by the nature of the function. If you have the data already, and were confident nothing would change in the future, you should be able to estimate this already.

But, again, I don't think it will tell you what you seem to be asking.
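
For what it's worth: if the fitted function were the saturating exponential $y = a(1 - e^{-bx}) + c$ (an assumption, not something the OP has confirmed), the leveling-out point does follow directly from the fitted rate $b$. The curve closes a fraction $p$ of the gap to its asymptote at

$$x_p = -\frac{\ln(1 - p)}{b}, \qquad \text{e.g.}\ x_{0.95} \approx \frac{3}{b}$$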
 

Miner

TS Contributor
#13
I don't know of a specific test, but I am thinking along the lines of taking moving ranges (signed successive differences). While a curve is increasing or decreasing, the differences will all have the same sign (+ or −). Once the curve flattens out, the signs should fluctuate randomly. Of course, it will take more measurements for this to become statistically significant than may be of practical importance.
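
A sketch of that idea, assuming evenly spaced observations; the window size and cutoff are illustrative choices, and a simple sign test stands in for a formal runs test:

```python
import numpy as np
from scipy.stats import binomtest

def signs_look_random(y, window=7, alpha=0.05):
    # Sign test on the last `window` successive differences: once the curve
    # has flattened, + and - should be roughly equally likely.
    d = np.diff(np.asarray(y, dtype=float))[-window:]
    k = int((d > 0).sum())                    # number of increases
    p = binomtest(k, n=len(d), p=0.5).pvalue  # H0: signs are fair coin flips
    return p > alpha                          # True -> consistent with "flat"

y = [1.0, 2.1, 2.9, 3.5, 3.8, 4.0, 4.05, 3.98, 4.1,
     4.02, 4.07, 3.99, 4.05, 4.1, 4.03]
print(signs_look_random(y))  # last 7 diffs fluctuate in sign -> True
```

With only around seven differences per window the test has little power, which is exactly the caveat above about statistical versus practical significance.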