Given a table of values for a function f(x), explain how to determine whether f(x) is linear or nonlinear from the average rate of change.
Linear functions are functions whose average rate of change is constant. (The constant rate of change is called the slope.)
Given a table of values, choose any two points and compute the average rate of change: `m=(y_2-y_1)/(x_2-x_1)=(Delta y)/(Delta x)` . Then do the same for other pairs of points. If the function is linear, the rate of change will be the same regardless of which pairs of points you pick.
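The check above can be sketched in a few lines of Python. This is a minimal illustration with made-up table data (the `xs`/`ys` lists are hypothetical): compute `m` for every pair of points and see whether all the rates agree.

```python
# Hypothetical table of values; here y = 2x + 1, so f is linear.
xs = [1, 2, 4, 7]
ys = [3, 5, 9, 15]

def average_rate_of_change(x1, y1, x2, y2):
    """m = (y2 - y1) / (x2 - x1)."""
    return (y2 - y1) / (x2 - x1)

# Compute the rate for every pair of points; if all rates agree,
# the table is consistent with a linear function.
rates = [
    average_rate_of_change(xs[i], ys[i], xs[j], ys[j])
    for i in range(len(xs))
    for j in range(i + 1, len(xs))
]
is_linear = all(abs(m - rates[0]) < 1e-9 for m in rates)
print(is_linear, rates[0])  # True 2.0
```

For this table every pair gives `m = 2`, so the data is consistent with a linear function (here, `f(x) = 2x + 1`).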
** If the x's are separated by the same amount, you can use the method of finite differences. See link **