I am trying to understand how to choose a graphics card based on a **price-to-performance** value for each card.

I have price/performance data for several graphics cards, and I plotted it. I naturally thought it might be best to fit a linear regression line to this data set and rank each card by the delta between the regression line and its data point.

But then I thought I may be overcomplicating this. I can simply divide the performance by the price and look for the cards with the highest ratio.
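The ratio approach is just a one-liner per card. A minimal sketch with made-up card names, prices, and performance scores (all hypothetical):

```python
# Hypothetical data: (card name, price in USD, performance score)
cards = [
    ("Card A", 300, 120),
    ("Card B", 450, 150),
    ("Card C", 200, 90),
]

# Performance per dollar: higher means better value
ranked = sorted(cards, key=lambda c: c[2] / c[1], reverse=True)

for name, price, perf in ranked:
    print(f"{name}: {perf / price:.3f} performance points per dollar")
```

With these numbers, Card C (0.450 points/$) beats Card A (0.400) and Card B (0.333), even though Card B has the highest raw performance.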

Am I thinking along the right lines? As I understand it, linear regression is used to estimate or predict the value of a dependent variable (price) from an independent variable (performance). While that may be useful to me in the future, it isn't necessary for the analysis I'm doing now: finding the price/performance leaders in a dataset.
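For comparison, the regression-residual idea I described above can be sketched in a few lines of plain Python. This fits price as a linear function of performance by ordinary least squares (made-up numbers again); a negative residual means the card is cheaper than the trend predicts for its performance level:

```python
# Hypothetical (performance score, price in USD) pairs
data = [(90, 200), (120, 300), (150, 450), (200, 700)]

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Ordinary least squares slope and intercept for price ~ performance
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / sum(
    (x - mean_x) ** 2 for x, _ in data
)
intercept = mean_y - slope * mean_x

# Residual = actual price minus trend price; negative means cheaper
# than the trend predicts, i.e. better value under this view
for x, y in data:
    residual = y - (slope * x + intercept)
    print(f"perf {x}: residual {residual:+.1f}")
```

Note that the two methods can rank cards differently: the ratio rewards absolute performance per dollar, while the residual rewards being below the price trend for a given performance tier.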

Thanks for considering this question and dealing with my noobiness.