Not Really a Statistical Problem, More Like Algebra

#1
When I’m driving, I tend to do mental gymnastics. Looking at my GPS:
It’s noon, the destination is 100 miles away, and the speed limit is 50 mph. The GPS computes arrival at 2 pm (2 hours).
So, assuming one instantly speeds up to 60 mph, how many minutes will it be until the ETA drops by 1 minute?
I figure 50 mph = 0.83 miles/minute and 60 mph = 1 mile/minute, a delta of 0.17 miles/minute (exactly 1/6). So each minute driving at 60 I “pick up” about 0.17 miles.
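
To sanity-check those per-minute rates, here is a minimal Python sketch (nothing assumed beyond the mph-to-miles-per-minute conversion):

```python
# Convert the two speeds to miles per minute and take the difference.
limit_mph = 50
actual_mph = 60

limit_rate = limit_mph / 60       # 0.8333... miles per minute (exactly 5/6)
actual_rate = actual_mph / 60     # 1.0 mile per minute
delta = actual_rate - limit_rate  # 0.1666... miles per minute gained

print(f"{limit_rate:.4f} {actual_rate:.4f} {delta:.4f}")
```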

So it takes 1/0.17 ≈ 5.88 minutes (exactly 6 minutes, using 1/6) to pick up a mile.
But what I am really trying to compute is time, i.e., how many minutes at 60 mph (10 mph over the speed limit) it takes to reduce the ETA by 1 minute.
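
Here is one way to model it numerically: a sketch that assumes the GPS recomputes the ETA by pricing whatever distance remains at the 50 mph limit, while the car actually covers ground at 60 mph.

```python
# Sketch of the displayed ETA while driving 60 mph, ASSUMING the GPS
# estimates the remaining leg at the 50 mph limit.
total_miles = 100
gps_mph = 50     # speed the GPS assumes for the remaining distance
actual_mph = 60  # speed actually driven

initial_eta = total_miles / gps_mph * 60  # 120 minutes after noon

def displayed_eta(minutes_driven):
    """ETA in minutes after noon, after driving at 60 for `minutes_driven` minutes."""
    remaining = total_miles - (actual_mph / 60) * minutes_driven
    return minutes_driven + remaining / gps_mph * 60

for t in range(7):
    print(t, "min driven -> ETA drop:", round(initial_eta - displayed_eta(t), 2), "min")
```

Under that assumption, each minute at 60 covers a mile the GPS had budgeted 1.2 minutes for, so the displayed ETA falls by 0.2 minutes per minute driven, and the loop shows the first full 1-minute drop arriving after 5 minutes.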

Can you help with my mental block? THANKS