Given an array of integers arr (of size n) and an integer w, find the maximum number in every subarray of arr of length w.
Imagine that n is very large and a sliding window of a smaller size w is moving through arr from left to right. We need to find the maximum in every position of the sliding window.
arr = [1, 3, -1, -3, 5, 3, 6, 7]
w = 3
Output: [3, 3, 5, 5, 6, 7]
The size of arr is 8, so the size of the output array is n-w+1 = 8-3+1 = 6.
Here are all the 6 positions of the sliding window and the corresponding maximum values:
1) [1 3 -1] -3 5 3 6 7. Maximum is 3.
2) 1 [3 -1 -3] 5 3 6 7. Maximum is 3.
3) 1 3 [-1 -3 5] 3 6 7. Maximum is 5.
4) 1 3 -1 [-3 5 3] 6 7. Maximum is 5.
5) 1 3 -1 -3 [5 3 6] 7. Maximum is 6.
6) 1 3 -1 -3 5 [3 6 7]. Maximum is 7.
Input Parameters: The function has two arguments: arr and w.
Output: The function must return an array of integers of length n-w+1. The i-th value in the returned array must be the maximum among arr[i], arr[i+1], ..., arr[i+w-1].
In a brute force solution, we could identify all the n-w+1 "windows" of size w and find the maximum in each one by looking through all of its elements. The time complexity of such a solution is O((n-w+1)*w), which is O(n*w) when w is much smaller than n. We did not provide a sample implementation of this brute force algorithm.
The first time, we would look for the maximum number among arr[0..w-1]; the second time, among arr[1..w], and so on. arr[1..w-1] is the common part of the first and second "windows". Can we avoid repeating that computation and improve the time complexity?
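Since no sample implementation was provided, here is a minimal sketch of the brute force in Python (the function name is ours):

```python
def brute_force(arr, w):
    """Return the maximum of every length-w window of arr.

    Scans all w elements of each of the n-w+1 windows: O((n-w+1)*w) time.
    """
    n = len(arr)
    return [max(arr[i:i + w]) for i in range(n - w + 1)]
```

For the example above, `brute_force([1, 3, -1, -3, 5, 3, 6, 7], 3)` yields `[3, 3, 5, 5, 6, 7]`.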
Using a priority queue gets us a better solution. Such an algorithm keeps exactly w elements in the priority queue at a time: it adds one element and removes one element each time the "sliding window" moves to the right by one position. A priority queue implemented with a self-balancing binary search tree (such as a red-black tree or B-tree), or with a heap that supports removal of an arbitrary element, has the following time complexities for the operations we care about:
* O(1) for "get maximum" operation,
* O(log(w)) for "add element" operation,
* O(log(w)) for "remove element" operation.
A solution to our problem based on a priority queue can therefore achieve a time complexity of O(n*log(w)), much better than the brute force.
We did not provide a sample implementation of this algorithm either.
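As a rough sketch of this idea in Python: the standard library's `heapq` is a min-heap with no arbitrary-removal operation, so the version below uses negated values (to simulate a max-heap) and "lazy deletion", discarding stale entries only when they surface at the top. Note that lazy deletion lets the heap grow beyond w entries, so strictly speaking this variant is O(n*log(n)); a tree-based priority queue with true removals achieves the O(n*log(w)) bound described above.

```python
import heapq

def priority_queue_solution(arr, w):
    """Sliding-window maximum using a max-heap with lazy deletion."""
    result = []
    heap = []  # entries are (-value, index); heapq is a min-heap
    for i, x in enumerate(arr):
        heapq.heappush(heap, (-x, i))
        if i >= w - 1:
            # Lazily discard entries whose index fell out of the window.
            while heap[0][1] <= i - w:
                heapq.heappop(heap)
            result.append(-heap[0][0])
    return result
```

For the example above, `priority_queue_solution([1, 3, -1, -3, 5, 3, 6, 7], 3)` yields `[3, 3, 5, 5, 6, 7]`.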
We provided one sample solution with an even better time complexity, O(n). It uses a data structure called a deque, which can be implemented as a doubly linked list (or you can use your favorite language's standard library implementation). Elements can be added to or removed from either end of a deque in constant time; optimal_solution takes advantage of that property.
We don't necessarily need to keep _all_ w elements (let alone sort them) as we move the "sliding window" through arr. Some elements are not "useful": they can never end up in the output as the maximum, so we can discard them (and do so in constant time per element, see below).
Consider the following example: arr = [3, -1, -3, 5, 3], w=3.
At some point our deque has first three elements from arr, (3, -1, -3), in this order, and we are looking to append the next one, 5.
First, we can drop 3 from the deque because it no longer fits in the current sliding window (only -1, -3 and 5 fit). To perform this check efficiently (in constant time), the deque stores not the actual elements of arr but their indices; see the code for more details.
Next, we start from the end of the deque and notice that because -3 < 5, -3 can never be the maximum number in any remaining sliding window. Indeed, all the remaining "sliding windows" that include -3 will also include 5. So 5 (or a greater number if found later) would be the maximum, never -3. So we discard -3 from the deque.
We now check -1 in the same manner (we keep doing this while the deque isn't empty and its last element is less than 5). Similarly, we find that -1 cannot be the maximum in any of its remaining "sliding windows", because all of those windows also include 5. We discard -1 from the deque, too.
The deque is now empty and we can finally append 5 to its end.
Having appended an element to the deque, we get the maximum of the current "sliding window" from the front of the deque (here, of course, that's 5).
Moving the "sliding window" forward by one element, 3 becomes the current element. We again check whether there is an out-of-window element in the deque that needs to be removed (there is none: the only element, 5, is within w elements of the current element).
Now we want to append the current element to the deque. But first, just as we did with 5, we check whether any "useless" elements need to be removed from the deque (those that, once 3 is appended, can no longer become the maximum and end up in the output). Because 5 is greater than 3, 5 is not useless. We also need to keep 3 in the deque: 5 will leave the "sliding window" before 3 does, and at that point 3 can become the maximum.
So we keep 5 in the deque and append 3 to the end.
The front of the deque is now once again guaranteed to hold the maximum of the current "sliding window".
This is how the optimal_solution works.
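The walkthrough above can be sketched in Python as follows. This is our reconstruction; the provided optimal_solution may differ in details, but it follows the same steps: evict the out-of-window index from the front, evict "useless" smaller elements from the back, append the current index, and read the maximum from the front.

```python
from collections import deque

def optimal_solution(arr, w):
    """O(n) sliding-window maximum.

    The deque stores indices of "useful" elements; the values at those
    indices are in decreasing order from front to back.
    """
    dq = deque()
    result = []
    for i, x in enumerate(arr):
        # Drop the front index if it has slid out of the current window.
        if dq and dq[0] <= i - w:
            dq.popleft()
        # Drop elements that can never be a window maximum: they are
        # smaller than x and will leave the window no later than x does.
        while dq and arr[dq[-1]] < x:
            dq.pop()
        dq.append(i)
        # Once the first full window ends, the front index points at
        # the maximum of the current window.
        if i >= w - 1:
            result.append(arr[dq[0]])
    return result
```

For the example above, `optimal_solution([1, 3, -1, -3, 5, 3, 6, 7], 3)` yields `[3, 3, 5, 5, 6, 7]`.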
Reading the optimal_solution code, you may notice a "while" loop inside the "for" loop that handles all n elements. Don't let that confuse you. The key observation for understanding the time complexity of optimal_solution is that each element of arr is appended to the deque exactly once and removed from it at most once. Appending and removing are done in constant time, and so are all the related checks.
The deque can grow to as many as w elements in the worst case.
The input size is O(n), the auxiliary space used is O(w), and the output size is O(n-w+1). Summing up all three and taking into account that w <= n, the total space complexity is O(n).