Verify the given linear approximation at `x = 0`. Then determine the values of x for which the linear approximation is accurate to within 0.1.
Enter your answer using interval notation. Round answers to three decimal places.
`ln(1 + x) ~~ x`
To "verify" the approximation, we show that the tangent line to `f(x) = ln(1 + x)` at `x = 0` really is `L(x) = x`. Since `f(0) = ln(1) = 0` and `f'(x) = 1/(1 + x)`, we have `f'(0) = 1`, so the linearization is `L(x) = f(0) + f'(0)x = x`, exactly as claimed.
As a quick sanity check, we can also evaluate both sides at `x = 0`:

`L(0) = 0`

`ln(0 + 1) = ln(1) = 0`

Both are equal at `x = 0`, so the approximation is exact there.
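The tangent-line check above can also be done numerically. The sketch below (plain Python, nothing calculator-specific) confirms that `f(0) = 0` and estimates `f'(0)` with a central difference, which should come out close to 1:

```python
import math

# f(x) = ln(1 + x); its linearization at 0 is L(x) = f(0) + f'(0) * x.
f = lambda x: math.log(1 + x)

f0 = f(0.0)                        # f(0) = ln(1) = 0
h = 1e-6
fprime0 = (f(h) - f(-h)) / (2 * h) # central-difference estimate of f'(0)

print(f0)       # 0.0
print(fprime0)  # approximately 1.0, so L(x) = 0 + 1*x = x
```

With `f(0) = 0` and `f'(0) = 1`, the linearization is `L(x) = x`, matching the given approximation.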
Now, to find the interval on which this approximation has an error of less than 0.1, we must solve the following inequality:
`|ln(x+1)-x| < 0.1`
This absolute-value inequality is equivalent to both of the following conditions holding at once:
`ln(x+1)-x < 0.1`
`ln(x+1)-x > -0.1`
Subtract 0.1 from both sides of the first inequality, and add 0.1 to both sides of the second, to yield the following:
`ln(x+1) - x - 0.1 < 0`
`ln(x+1)-x+0.1 > 0`
Now we can use the root-finding function on a graphing calculator to determine the boundary values of `x`: graph each expression and find where it crosses zero.

The first inequality turns out to be true for all `x > -1`: since `ln(x + 1) <= x` everywhere on the domain, the expression `ln(x + 1) - x - 0.1` is always negative. The second inequality, however, is only true on a certain interval. Using the root-finding function, you can find that `ln(x + 1) - x + 0.1` crosses zero at approximately `(-0.383, 0)` and `(0.516, 0)`.
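If you'd rather not use a calculator, the same roots can be found with a simple bisection search. This is a minimal sketch (the bracketing intervals were chosen by checking signs by hand):

```python
import math

# Error function: g(x) = ln(1 + x) - x + 0.1.
# Its zeros are where the approximation error reaches exactly -0.1.
g = lambda x: math.log(1 + x) - x + 0.1

def bisect(lo, hi, tol=1e-9):
    """Bisection on g over [lo, hi], assuming g changes sign there."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if g(lo) * g(mid) <= 0:
            hi = mid  # sign change is in [lo, mid]
        else:
            lo = mid  # sign change is in [mid, hi]
    return (lo + hi) / 2

left = bisect(-0.99, -0.1)  # g(-0.99) < 0 and g(-0.1) > 0
right = bisect(0.1, 1.0)    # g(0.1) > 0 and g(1.0) < 0

print(round(left, 3), round(right, 3))  # -0.383 0.516
```

Rounded to three decimal places, the roots are `-0.383` and `0.516`, matching the calculator result.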
Therefore, the linear approximation is accurate to within 0.1 on the open interval `(-0.383, 0.516)`.

In other words, `-0.383 < x < 0.516`.
I hope this helps!