The solution by Daniel Saad is on the right track, but unfortunately isn't complete.
It's not difficult to come up with an example where the greedy algorithm fails, i.e. where an item is better left out even though it satisfies the condition at the moment it is processed:
Let $A=(1, 0, 2, 2)$ and $k=1$. The answer is $(1, 2, 2)$, and one would make a mistake by including $0$ in the subsequence, because it shuts the door on both $2$s. Obviously, the stretch of items that has to be skipped can be arbitrarily long, e.g. $A=(1, 0, ..., 0, 2, 2, ..., 2)$.
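For concreteness, here is a minimal sketch of that greedy rule, assuming (as the examples suggest) that the problem is to find a longest subsequence whose maximum and minimum differ by at most $k$; the function name and code are mine, not part of the original solution:

```python
def greedy_take_if_it_fits(A, k):
    """Sketch of the greedy rule: take an item whenever the running range
    (max - min of the items taken so far) stays within k.
    Assumes the max-minus-min reading of the problem."""
    taken = []
    for x in A:
        if not taken or max(max(taken), x) - min(min(taken), x) <= k:
            taken.append(x)
    return taken

print(greedy_take_if_it_fits([1, 0, 2, 2], 1))  # [1, 0] -- length 2, while (1, 2, 2) has length 3
```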
Moreover, an algorithm that tries to extend the best previously found subsequences is also wrong, because the principle of optimality doesn't hold. Consider this example:
Let $A=(0, 1, -1, 0, 2, 2, 2, 2)$ and $k=2$. The optimal subsequence up to $A[4]$ is $(0, 1, -1, 0)$ (it contains every item so far). But the answer up to $A[5]$
does not extend the best sequence up to $A[4]$, nor the one up to $A[2]$!
The best subsequence actually includes $A[4]=0$ and excludes $A[3]=-1$.
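To check such claims mechanically, one can use a small brute-force helper (hypothetical, not part of the algorithm below) that, under the same max-minus-min reading of the problem, finds a longest subsequence ending at a given index:

```python
from itertools import combinations

def best_ending_at(A, k, end):
    """Brute force (hypothetical helper): a longest subsequence that ends at
    0-based index `end` and whose max and min differ by at most k."""
    best = (A[end],)
    for r in range(1, end + 1):
        for idx in combinations(range(end), r):
            vals = [A[i] for i in idx] + [A[end]]
            if max(vals) - min(vals) <= k and len(vals) > len(best):
                best = tuple(vals)
    return best

A = [0, 1, -1, 0, 2, 2, 2, 2]
print(best_ending_at(A, 2, 3))  # 0-based index 3 is the text's A[4]: (0, 1, -1, 0)
print(best_ending_at(A, 2, 4))  # 0-based index 4 is the text's A[5]: (0, 1, 0, 2) -- drops -1,
                                # so it is not an extension of the line above
```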
The algorithm I came up with fills a table $T[i, p, q]$, which holds the best subsequence that ends at item $i$ and whose minimum is $A[p]$ and maximum is $A[q]$ ($p \leq i$ and $q \leq i$).
This means that at each step $i$, the list of candidates holds up to $i^2$ values, one for each $(A[p], A[q])$ pair from $A[1..i]$. The total time is $O(n^4)$!
The base equation is trivial. The transition at step $i+1$ builds its list of candidates by extending all candidates from the previous steps $T[j]$, $j \leq i$. The total number of candidates across all steps is $O(n^3)$. I leave writing the formal recurrence as an exercise.
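To make the construction concrete, here is a rough sketch of how such a table could be filled (again assuming the max-minus-min reading of the problem; the dictionary-based representation, 0-based indices and all names are mine):

```python
def longest_bounded_subsequence(A, k):
    """Sketch of the T[i, p, q] table: each stored candidate maps
    (i, p, q) -> length of the best subsequence that ends at index i,
    has its minimum at index p and its maximum at index q."""
    best = 0
    T = {}
    for i in range(len(A)):
        # Base case: the one-item subsequence consisting of A[i] alone.
        new = {(i, i, i): 1}
        # Transition: try to extend every candidate ending at some j < i.
        for (j, p, q), length in T.items():
            lo, hi = min(A[p], A[i]), max(A[q], A[i])
            if hi - lo <= k:
                new_p = p if A[p] <= A[i] else i
                new_q = q if A[q] >= A[i] else i
                key = (i, new_p, new_q)
                new[key] = max(new.get(key, 0), length + 1)
        T.update(new)
        best = max(best, max(new.values()))
    return best

print(longest_bounded_subsequence([1, 0, 2, 2], 1))               # 3, e.g. (1, 2, 2)
print(longest_bounded_subsequence([0, 1, -1, 0, 2, 2, 2, 2], 2))  # 7, e.g. (0, 1, 0, 2, 2, 2, 2)
```

Each step scans every candidate stored so far, and there are $O(n^3)$ of them in total, which is where the $O(n^4)$ bound comes from.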