No. You're making it too hard on yourself. What does "difference" mean? Part 1 asks you to calculate the difference in the average amount of commercials between network and cable TV. Yes, I believe they are asking you to compare the means.
This graph will aid our discussion. Part 2 asks: Suppose the standard deviation in the amount of commercials for both network and cable TV was either 5 minutes or 15 seconds. Which standard deviation would lead you to conclude that there was a major difference between the two averages?
When two means are more than 2 standard deviations apart, we say they are significantly different. The standard deviation measures the variation of the data away from the mean, so 5 minutes is a much greater spread than 15 seconds. That means a graph (the bell-shaped normal curve in the link I provided) with a 5-minute SD would be a lot wider than one with a 15-second SD.

Now look at how the two graphs (normal curves) sit on top of each other. The green line marks 2 SDs above the mean; roughly 95% of the population falls within 2 SDs of the mean, so only a small tail lies beyond that green mark. If the mean of the second distribution (the high point of the second curve) lies beyond the 2-SD mark (green line) of the first distribution, we can say there is a significant difference between the two. In other words, the difference is unlikely to have occurred by chance, because the raw difference between the means (taking into account the standard deviation, i.e., the spread of the observations away from the mean) is large enough to generalize to the larger population.
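Here's a minimal sketch of that 2-SD rule of thumb. The means below (10 and 12 minutes) are made up purely for illustration, since the problem's actual averages come from your own data:

```python
def significantly_different(mean_a, mean_b, sd):
    """Rule of thumb from above: call the difference 'major' when
    the two means are more than 2 standard deviations apart."""
    return abs(mean_a - mean_b) > 2 * sd

# Hypothetical averages (assumed, not from the problem):
network_mean = 10.0  # minutes of commercials
cable_mean = 12.0    # minutes of commercials

# With sd = 5 minutes the curves are wide and overlap heavily:
print(significantly_different(network_mean, cable_mean, 5.0))   # False

# With sd = 15 seconds (0.25 minutes) the curves are narrow:
print(significantly_different(network_mean, cable_mean, 0.25))  # True
```

Notice the same 2-minute gap between the means looks huge next to a 15-second spread but tiny next to a 5-minute spread, which is exactly what Part 2 is getting at.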
I didn't give you answers but the information to begin thinking through the problem.