You are given an integer array nums with length n. The cost of a subarray nums[l..r], where 0 <= l <= r < n, is defined as:
cost(l, r) = nums[l] - nums[l + 1] + nums[l + 2] - ... + nums[r] * (-1)^(r - l)
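For instance, cost(0, 2) on [1, -2, 3] is 1 - (-2) + 3 = 6. A direct Python translation of the definition (the helper name is illustrative, not part of the problem):

```python
def cost(nums, l, r):
    # Alternating-sign sum of nums[l..r]: + at even offsets from l, - at odd.
    return sum(nums[i] * (-1) ** (i - l) for i in range(l, r + 1))

print(cost([1, -2, 3], 0, 2))  # 1 - (-2) + 3 = 6
```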
Your task is to split nums into subarrays such that the total cost of the subarrays is maximized, ensuring each element belongs to exactly one subarray.
Formally, if nums is split into k subarrays at indices i₁, i₂, ..., iₖ₋₁, where 0 <= i₁ < i₂ < ... < iₖ₋₁ < n - 1, then the total cost will be:
cost(0, i₁) + cost(i₁ + 1, i₂) + ... + cost(iₖ₋₁ + 1, n - 1)
Return an integer denoting the maximum total cost of the subarrays after splitting the array optimally.
Note: If nums is not split into subarrays, i.e. k = 1, the total cost is simply cost(0, n - 1).
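To make the splitting rule concrete, here is a sketch of a brute force that enumerates every choice of split points and keeps the best total; the array [1, -2, 3, 4] and the helper names are illustrative, not taken from the problem. Splitting it into [1, -2, 3] and [4] gives (1 - (-2) + 3) + 4 = 10.

```python
from itertools import combinations

def alternating_cost(nums, l, r):
    # cost(l, r) as defined above: signs alternate starting with + at index l.
    return sum(nums[i] * (-1) ** (i - l) for i in range(l, r + 1))

def max_total_cost_bruteforce(nums):
    # Try every subset of the n - 1 possible cut positions (exponential time,
    # only meant to illustrate the definition on tiny arrays).
    n = len(nums)
    best = float("-inf")
    for k in range(n):
        for cuts in combinations(range(n - 1), k):
            ends = list(cuts) + [n - 1]
            start = total = 0
            for end in ends:
                total += alternating_cost(nums, start, end)
                start = end + 1
            best = max(best, total)
    return best

print(max_total_cost_bruteforce([1, -2, 3, 4]))  # 10: split as [1, -2, 3] and [4]
```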
Input & Output
Constraints
- 1 ≤ nums.length ≤ 10^5
- -10^6 ≤ nums[i] ≤ 10^6
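Since nums.length can be as large as 10^5, enumerating splits is infeasible. One common linear-time approach (a sketch, not necessarily the intended solution) tracks, for each prefix, the best total when the latest element was taken with a + sign versus a - sign:

```python
def max_total_cost(nums):
    # plus  = best total so far with nums[i] contributing +nums[i]
    # minus = best total so far with nums[i] contributing -nums[i]
    plus, minus = nums[0], float("-inf")  # a subarray's first element is always +
    for x in nums[1:]:
        # +x: x either starts a new subarray or follows a - element;
        # -x: x must extend a subarray whose previous element was +.
        plus, minus = x + max(plus, minus), plus - x
    return max(plus, minus)

print(max_total_cost([1, -2, 3, 4]))  # 10
```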