This problem asks us to solve the equation *xa* + *yb* + *zc* = *n*. We can enumerate *x* and *y*, and then compute *z* directly from *zc* = *n* - *xa* - *yb*, which gives an algorithm with complexity *O*(*n*^{2}).
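As a minimal sketch of this enumeration (assuming the task is to count nonnegative integer solutions and that *a*, *b*, *c* are positive; the function name is illustrative):

```cpp
#include <cassert>

// Count nonnegative integer solutions (x, y, z) of x*a + y*b + z*c = n.
// For each pair (x, y), z is determined by z*c = n - x*a - y*b, so we only
// check divisibility and nonnegativity -- O(n^2) pairs overall.
// Assumes a, b, c > 0.
long long countSolutions(long long a, long long b, long long c, long long n) {
    long long count = 0;
    for (long long x = 0; x * a <= n; ++x)
        for (long long y = 0; x * a + y * b <= n; ++y) {
            long long rest = n - x * a - y * b;
            if (rest % c == 0) ++count;  // z = rest / c is a valid solution
        }
    return count;
}
```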

The key idea is to enumerate all the feasible center points. For each center point with coordinate (*x*, *y*), it contributes *min*(*x*, *w* - *x*) × *min*(*y*, *h* - *y*) to the final result.
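A sketch of the accumulation loop (assuming, for illustration, that every interior lattice point is a feasible center; the actual feasibility predicate depends on the problem's constraints, and the function name is hypothetical):

```cpp
#include <algorithm>
#include <cassert>

// Sum the contributions of all interior center points (x, y) in a w-by-h
// region: each center contributes min(x, w - x) * min(y, h - y).
long long sumCenterContributions(int w, int h) {
    long long total = 0;
    for (int x = 1; x < w; ++x)
        for (int y = 1; y < h; ++y)
            total += (long long)std::min(x, w - x) * std::min(y, h - y);
    return total;
}
```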

We adopt two pointers *p*1 and *p*2, pointing into array *a* and array *b*, respectively. Before moving on to the general algorithm, we start with a simple example to build some intuition.

At first, we try to put the element *a*[*i*] = *b*[0] at position 0. If *a*[0] = *b*[0], we do nothing and instead try to put the element *a*[*i*] = *b*[1] at position 1. We continue this until the first *a*[*i*] ≠ *b*[*i*] is met; without loss of generality, we assume that *a*[0] ≠ *b*[0]. To accomplish this, we should take out all the elements of array *a* with indices greater than or equal to *i* (remember to count the number of operations), and then we can put *a*[*i*] = *b*[0] at position 0. Well, what should we do with the other elements that have been taken out? In fact, they “have been” put in their correct positions. Although our algorithm cannot tell their correct positions right now, since we may take out more elements from the end and thus shift their positions by some offset, we still have full “observation” and can completely determine their current correct positions by considering the “future”.

Therefore, we have the following general algorithm. At first *p*1 = *p*2 = 0, and we check whether *a*[*p*1] = *b*[*p*2]. If yes, we increase both *p*1 and *p*2 by one. Otherwise, we check whether the element *a*[*i*] = *b*[*p*2] has already been taken out. If so, we increase only *p*2 by one; otherwise, we take out more elements from the end until *a*[*i*] = *b*[*p*2] is met, and then increase *p*2 by one. We count the number of operations during this process.
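The two-pointer scan above can be sketched as follows (a simulation counting one operation per taken-out element, assuming *b* is a reachable rearrangement of *a*; the function name is illustrative):

```cpp
#include <cassert>
#include <set>
#include <vector>

// Count the operations needed to turn array a into array b, where one
// operation takes out the last remaining element of a and re-inserts it
// at its correct position.
long long countOperations(const std::vector<int>& a, const std::vector<int>& b) {
    int n = a.size();
    int p1 = 0;                   // next unmatched prefix element of a
    int tail = n;                 // a[tail..n-1] have already been taken out
    std::multiset<int> takenOut;  // taken-out elements not yet matched to b
    long long ops = 0;
    for (int p2 = 0; p2 < n; ++p2) {
        if (p1 < tail && a[p1] == b[p2]) {
            ++p1;  // already in place: advance both pointers
        } else if (takenOut.count(b[p2])) {
            takenOut.erase(takenOut.find(b[p2]));  // already taken out
        } else {
            // take out elements from the end until b[p2] appears
            while (tail > p1 && a[tail - 1] != b[p2]) {
                takenOut.insert(a[--tail]);
                ++ops;
            }
            --tail; ++ops;  // take out b[p2] itself and place it
        }
    }
    return ops;
}
```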

For each type of car, we adopt the Floyd–Warshall algorithm to calculate the minimum time it costs to go from city *i* to city *j*, which is denoted as *t*[*i*][*j*]. Then, we use *dp*[*i*][*j*][*k*] to denote the minimum time it costs to go from city *i* to city *j* with no more than *k* changes. The recurrence is *dp*[*i*][*j*][*k*] = *min*_{j' ≠ j}(*dp*[*i*][*j*'][*k* - 1] + *t*[*j*'][*j*]). Note that it is sufficient to calculate *k* up to *n*, since we have at most *n* cities, and the optimal path must be a simple path with no loops. Thus, remember to update *k* = *min*(*k*, *n*) for the originally given *k*.
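A sketch of the two stages (Floyd–Warshall per car type, then the dp over changes). Here `best` stands for the element-wise minimum of the per-car Floyd results, and all names are illustrative:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

using Matrix = std::vector<std::vector<long long>>;
const long long INF = 1e18;

// Floyd–Warshall on one car type's travel-time matrix t
// (t[i][i] = 0, INF where there is no edge).
void floyd(Matrix& t) {
    int n = t.size();
    for (int m = 0; m < n; ++m)
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                t[i][j] = std::min(t[i][j], t[i][m] + t[m][j]);
}

// dp over the number of allowed car changes for a fixed source i:
// dp_k[j] = min_{j'} (dp_{k-1}[j'] + best[j'][j]), where best[u][v] is the
// fastest single-car time between u and v (min over the Floyd results).
long long minTimeWithChanges(const Matrix& best, int i, int j, int k) {
    int n = best.size();
    k = std::min(k, n);                   // a simple path needs < n changes
    std::vector<long long> dp = best[i];  // k = 0: one car, no changes
    for (int step = 1; step <= k; ++step) {
        std::vector<long long> next = dp;
        for (int v = 0; v < n; ++v)
            for (int u = 0; u < n; ++u)
                if (dp[u] < INF && best[u][v] < INF)
                    next[v] = std::min(next[v], dp[u] + best[u][v]);
        dp = next;
    }
    return dp[j];
}
```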

The main idea is to use binary search to find the optimal *q*, since if there exists some *q*' that satisfies the requirements, then any *q* ≥ *q*' must be a feasible answer as well.

For any *q* that we are testing, we should check whether we can reach the ending point *t* from the starting point *s* or not. If yes, we further decrease *q*; otherwise we try to increase it and test the next value again. For each test, we can use SPFA (shortest path faster algorithm) to compute the shortest path distance from the starting point *s* to any point *i*, denoted as *dis*[*i*], with average complexity *O*(*E*) where *E* is the number of edges (worst case *O*(*VE*)). However, we should modify the calculation slightly when we meet a special “volunteer point”. Once we have reached such a point with index *j*, we should set *dis*[*j*] = 0, since the distance is “cleared” at such points. Finally, if *dis*[*t*] ≤ *q*, then the current *q* satisfies the requirements and we should further decrease it; otherwise we increase it for the next test.
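A sketch of the check combined with the binary search, under some illustrative assumptions: unit-length edges, the start *s* hosts a volunteer (so *dis*[*s*] = 0), and relaxations past *q* are forbidden since one cannot walk more than *q* between volunteers. All names are hypothetical:

```cpp
#include <cassert>
#include <queue>
#include <vector>

const long long INF = 1e18;

// Feasibility check for a candidate q: SPFA from s where dis[i] is the
// distance walked since the last volunteer point; volunteer points reset
// dis to 0, and we never relax past q.
bool reachable(int n, const std::vector<std::vector<int>>& adj,
               const std::vector<bool>& volunteer, int s, int t, long long q) {
    std::vector<long long> dis(n, INF);
    std::queue<int> que;
    dis[s] = 0;  // s is assumed to host a volunteer
    que.push(s);
    while (!que.empty()) {
        int u = que.front(); que.pop();
        for (int v : adj[u]) {
            long long nd = dis[u] + 1;
            if (nd > q) continue;      // cannot walk further than q
            if (volunteer[v]) nd = 0;  // distance is "cleared" here
            if (nd < dis[v]) { dis[v] = nd; que.push(v); }
        }
    }
    return dis[t] <= q;
}

// Binary search for the minimum feasible q.
long long minQ(int n, const std::vector<std::vector<int>>& adj,
               const std::vector<bool>& volunteer, int s, int t) {
    long long lo = 0, hi = n, ans = -1;
    while (lo <= hi) {
        long long mid = (lo + hi) / 2;
        if (reachable(n, adj, volunteer, s, t, mid)) { ans = mid; hi = mid - 1; }
        else lo = mid + 1;
    }
    return ans;  // -1 if t is unreachable for any q
}
```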