I have two sets of columns: the first, M1 to M12, gives forecast figures, followed by an overall Total; the second set gives M1 to M12 actuals (see attached). I need to provide a variance in column AA for each row, which is the difference between the Forecast YTD figure and the Actual YTD figure up to the current month. The current month is defined as the month before the first actuals column whose sum is 0, rather than by the actual date.
So it needs to check the sum of each column in the M1 to M12 actuals until it finds the first column whose sum equals 0 (note it is possible for individual cells within a column to be zero, i.e. a zero actual), then SUM the actuals from M1 up to the last month whose column sum <> 0 (M3 in the attached), and then subtract the equivalent YTD total for the same months in the M1 to M12 forecast, giving the difference in column AA on a row-by-row basis.
So in the example, the first column in the M1 to M12 actuals to sum to 0 is column Q, so for row 2 it needs SUM(N2:P2) - SUM(A2:C2) = -1,763.
Likewise, if the total of column Q were non-zero and column R were the first to sum to 0, it would give the YTD variance for the first 4 months, and so on.
I've been playing around with OFFSET, MATCH and INDEX but still can't figure it out.
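One possible sketch along those lines, assuming a helper row (row 20 here, purely illustrative) holding the SUM of each actuals column (e.g. N20 = SUM(N2:N19), copied across to Y20), would use MATCH to find the first zero-sum column and OFFSET to size the two YTD ranges:

```
=SUM(OFFSET($N2,0,0,1,MATCH(0,$N$20:$Y$20,0)-1))-SUM(OFFSET($A2,0,0,1,MATCH(0,$N$20:$Y$20,0)-1))
```

MATCH(0,$N$20:$Y$20,0) returns the position of the first actuals column whose sum is 0 (4 when that column is Q), so an OFFSET width of MATCH(...)-1 covers M1 up to the last non-zero month in both the actuals (from $N2) and the forecast (from $A2). Note that if no column sums to zero (a full year of actuals), MATCH returns #N/A, so the MATCH part would need wrapping in IFERROR to fall back to a width of 12.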
Many thanks