A common approach for this kind of problem is DP. The idea is to choose numbers for the good positions first, then fill in the remaining positions later. Let
f(i, j, k) be the number of ways to choose j good positions among the first i positions, where k is a 2-bit mask recording whether numbers i and i + 1 have already been used. The DP transition is quite simple.
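One possible implementation of this table in Python (the function name, the modulus 10^9 + 7, and the exact transitions are my own illustrative assumptions, not taken from the submission): at each position i + 1 we either do not make it good, place number i there, or place number i + 2 there, shifting the mask by one as we advance.

```python
MOD = 10**9 + 7

def count_good_choices(n):
    # f[j][k]: ways to choose j good positions among the first i positions;
    # bit 0 of k = "number i is used", bit 1 of k = "number i + 1 is used".
    f = [[0, 0, 0, 0] for _ in range(n + 1)]
    f[0][1] = 1  # pretend number 0 is "used" so it can never be placed
    for i in range(n):
        g = [[0, 0, 0, 0] for _ in range(n + 1)]
        for j in range(i + 1):
            for k in range(4):
                if not f[j][k]:
                    continue
                # position i + 1 is not chosen as a good position
                g[j][k >> 1] = (g[j][k >> 1] + f[j][k]) % MOD
                # place number i at position i + 1 (needs number i unused)
                if not (k & 1):
                    g[j + 1][k >> 1] = (g[j + 1][k >> 1] + f[j][k]) % MOD
                # place number i + 2 at position i + 1 (only if it exists)
                if i + 2 <= n:
                    g[j + 1][(k >> 1) | 2] = (g[j + 1][(k >> 1) | 2] + f[j][k]) % MOD
        f = g
    # sum f(n, j, k) over the final masks k
    return [sum(f[j]) % MOD for j in range(n + 1)]
```

For n = 3 this returns [1, 4, 4, 0]: one way to choose zero good positions, four ways each for one and two, and no way to make all three positions good.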
Let F(j) be the number of permutations with j good positions; then
F(j) = Σ_k f(n, j, k) * (n - j)! (because there are n - j numbers left unchosen). However, placing these n - j remaining numbers into the permutation may create extra good positions, so
F(j) actually stores the number of permutations with at least j good positions (counted with multiplicity). Since we want
F(j) to be the number of permutations with exactly j good positions, we must exclude those permutations with more than j good positions.
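In code, the multiplication by (n - j)! might look like this (the input good[j] stands for the per-j sums Σ_k f(n, j, k); the function name and modulus are illustrative):

```python
def at_least_counts(n, good):
    # good[j] = number of ways to fix j good positions (sum of f(n, j, k) over k);
    # multiply by (n - j)! for the free placements of the remaining numbers.
    MOD = 10**9 + 7
    fact = [1] * (n + 1)
    for i in range(1, n + 1):
        fact[i] = fact[i - 1] * i % MOD
    return [good[j] * fact[n - j] % MOD for j in range(n + 1)]
```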
Let's do it in descending order. First,
F(n) is already the number of permutations with exactly n good positions. Hence, we can correct
F(n - 1) based on
F(n), then
F(n - 2) based on
F(n - 1) and
F(n), and so on. The last part is to calculate how many times a permutation with exactly j good positions is counted in
F(i) (for i < j): it is simply
C(j, i) (the number of ways to choose i good positions from those j good positions).
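The descending-order correction can be sketched as follows (F is the list of "at least" counts described above; the function name and modulus are illustrative assumptions):

```python
def exact_counts(F):
    # Subtract, from each F[i], every permutation with exactly j > i good
    # positions, which F[i] counts C(j, i) times.
    MOD = 10**9 + 7
    n = len(F) - 1
    # Pascal's triangle for the binomial coefficients C[j][i]
    C = [[0] * (n + 1) for _ in range(n + 1)]
    for j in range(n + 1):
        C[j][0] = 1
        for i in range(1, j + 1):
            C[j][i] = (C[j - 1][i - 1] + C[j - 1][i]) % MOD
    exact = F[:]
    for i in range(n - 1, -1, -1):
        for j in range(i + 1, n + 1):
            exact[i] = (exact[i] - C[j][i] * exact[j]) % MOD
    return exact
```

For example, with n = 3 the "at least" counts are [6, 8, 4, 0], and the pass above turns them into the exact counts [2, 0, 4, 0], matching a brute-force check over all 3! permutations.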
The overall complexity is O(n^2).
Please refer to my submission 3376355 for an implementation.