# Error weighting in two-point linear regression

#### noob3000

##### New Member
Hello everyone,

I have the following problem: I measure the formation of a product at two time points (t=0 and t=15), each in triplicate. Afterwards I calculate the mean value and the standard deviation at each time point. Then I do a linear regression (time on the x axis) to obtain the slope, which is the output I want. So far so good, but since I have standard deviations for both y-values, I also need a standard deviation for the slope, and here I am a little confused.
Would it be correct to simply fit the maximum and minimum slopes allowed by the error bars? And does it make sense to apply error weighting when I have only two measurement points?
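One thing worth noting: with only two x-values, an (unweighted or weighted) straight line passes exactly through both mean points, so the slope is just the difference quotient of the means, and its standard deviation can be obtained by ordinary error propagation from the standard errors of the two means. A minimal sketch of that idea in Python, using made-up replicate values purely for illustration:

```python
import math

# Hypothetical triplicate measurements (illustrative values only)
t0, t1 = 0.0, 15.0
y_t0 = [0.10, 0.12, 0.11]   # replicates at t = 0
y_t1 = [0.85, 0.90, 0.88]   # replicates at t = 15

def mean_and_sem(values):
    """Return the sample mean and standard error of the mean."""
    n = len(values)
    m = sum(values) / n
    # sample standard deviation (ddof = 1)
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / (n - 1))
    return m, sd / math.sqrt(n)

m0, sem0 = mean_and_sem(y_t0)
m1, sem1 = mean_and_sem(y_t1)

# Two points: the fitted line goes through both means exactly,
# so the slope is the difference quotient of the means ...
slope = (m1 - m0) / (t1 - t0)

# ... and the uncertainties of the two means propagate into the slope:
# sd(slope) = sqrt(sem0^2 + sem1^2) / (t1 - t0)
slope_sd = math.sqrt(sem0 ** 2 + sem1 ** 2) / (t1 - t0)

print(f"slope = {slope:.4f} +/- {slope_sd:.4f}")
```

This treats the two mean values as independent, which is usually reasonable for separate measurements; whether to use the standard deviation of the replicates or the standard error of the mean in the propagation depends on what uncertainty you want the slope error to represent.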
Usually I plot with QtiPlot and use "instrumental weighting" of the errors, which seems to use this formula: