Error weighting in a two-point linear regression

#1
Hello everyone,

I have the following problem: I measure the formation of a product at two time points (t = 0 and t = 15), each in triplicate. I then calculate the mean value and the standard deviation for each time point. Next I do a linear regression (time on the x-axis) to obtain the slope, which is the quantity I actually want. So far so good, but since I have standard deviations at both y values, I also need a standard deviation of the slope, and here I am a little confused.
Would it be correct to simply do regressions for the maximum and minimum slopes allowed by the error bars? And does it make sense to apply error weighting even though I only have two measurement time points?
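For concreteness, here is a rough Python sketch of the procedure I mean. The triplicate values are made up, and I am assuming that a fit with scipy.optimize.curve_fit using sigma and absolute_sigma=True corresponds to the 1/std(y)^2 weighting described below:

```python
import numpy as np
from scipy.optimize import curve_fit

# Made-up triplicate measurements (placeholders, not real data)
t = np.array([0.0, 15.0])                      # time points
y_t0 = np.array([0.10, 0.12, 0.11])            # triplicate at t = 0
y_t15 = np.array([0.85, 0.80, 0.90])           # triplicate at t = 15

y_mean = np.array([y_t0.mean(), y_t15.mean()])                 # means
y_std = np.array([y_t0.std(ddof=1), y_t15.std(ddof=1)])        # sample std devs

def line(x, slope, intercept):
    return slope * x + intercept

# Weighted fit: passing the std devs as sigma with absolute_sigma=True
# weights each point by 1/std^2 and keeps the covariance unscaled
popt, pcov = curve_fit(line, t, y_mean, sigma=y_std, absolute_sigma=True)
slope, intercept = popt
slope_sd = np.sqrt(pcov[0, 0])   # standard deviation of the slope

print(f"slope = {slope:.4f} +/- {slope_sd:.4f}")
```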
Usually I plot with QtiPlot and use the "instrumental weighting" option for the errors, which seems to use this formula:

[attached image: formula for the standard deviation of the slope under instrumental weighting]
(https://chemie.uni-paderborn.de/fil...PC/2018_Instrumentelles-Praktikum_Teil-PC.pdf)

Here s would be the standard deviation of the slope, w the weights derived from the standard deviations of the y values (w = 1/std(y)^2), and x the time points (0 and 15).
Since one of my x values is 0, it seems to be partly ignored in this formula, so I am not sure whether I can still apply it.
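If the formula in the linked script is the usual weighted least squares expression (that is my assumption about what "instrumental weighting" implements), then in code it would look roughly like this, and the point with x = 0 is not dropped, because its weight still enters the sums:

```python
import numpy as np

# Two time points with their weights w = 1/std(y)^2 (placeholder values)
x = np.array([0.0, 15.0])
y = np.array([0.11, 0.85])          # mean y values
std = np.array([0.01, 0.05])        # standard deviations of y
w = 1.0 / std**2

# Standard weighted least squares sums (assumed form of the formula)
S = w.sum()
Sx = (w * x).sum()
Sy = (w * y).sum()
Sxx = (w * x**2).sum()
Sxy = (w * x * y).sum()

delta = S * Sxx - Sx**2
slope = (S * Sxy - Sx * Sy) / delta
slope_sd = np.sqrt(S / delta)       # standard deviation of the slope

# Although x[0] = 0 makes the w[0]*x[0] terms vanish in Sx and Sxx,
# that point still contributes through S and Sy, so the formula applies.
print(f"slope = {slope:.4f} +/- {slope_sd:.4f}")
```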

I hope I could make my problem understandable; I would be very thankful for any help!