A curve has the equation `y = (x-a)sqrt(x-b)` for `x>=b`, where a and b are constants. It intersects the x-axis at the point A with x = b + 1. Show that the gradient of the curve at A is 1. Thanks!

ishpiro | College Teacher | (Level 1) Educator

First, use the fact that the curve intersects the x-axis at the point A with x = b + 1 to find a relationship between a and b:

The equation of the curve is `y = (x-a)(x-b)^(1/2) = (x-a)sqrt(x-b)`

At the point where it intersects the x-axis, x = b + 1 and y = 0. So,

`0 = (b+1-a)sqrt(b+1-b) = (b+1-a)sqrt(1)`

Since `sqrt(1) = 1`, this reduces to b + 1 - a = 0, so a = b + 1.
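If you want to double-check this step, here is a minimal SymPy sketch (assuming SymPy is installed) that substitutes x = b + 1 into the curve's equation and solves y = 0 for a:

```python
import sympy as sp

x, a, b = sp.symbols('x a b', real=True)
y = (x - a) * sp.sqrt(x - b)          # the curve y = (x - a)sqrt(x - b)

# Substitute x = b + 1 and solve y = 0 for a
solutions = sp.solve(sp.Eq(y.subs(x, b + 1), 0), a)
print(solutions)                       # [b + 1]
```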

Now let's calculate the gradient, or the slope of the tangent line, at point A.

This is the derivative of y(x) evaluated at point A. Using the product rule, together with the fact that the derivative of `sqrt(x-b)` is `1/(2sqrt(x-b))`:

`y'(x) = sqrt(x- b) + (x-a)/(2sqrt(x-b))`
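As a quick sanity check, this derivative can be confirmed symbolically; the sketch below (again assuming SymPy) compares `sp.diff` against the product-rule expression above:

```python
import sympy as sp

x, a, b = sp.symbols('x a b', real=True)
y = (x - a) * sp.sqrt(x - b)

dy = sp.diff(y, x)                                        # SymPy's derivative
by_hand = sp.sqrt(x - b) + (x - a) / (2 * sp.sqrt(x - b)) # product-rule result

print(sp.simplify(dy - by_hand))       # 0, so the two expressions agree
```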

At point A, x = b + 1 = a, so we can substitute x = a everywhere to determine the slope there:

`y'(a) = sqrt(a-b) + (a-a)/(2sqrt(a - b))`

Since a = b + 1, we have a - b = 1, so

`y'(a) = sqrt(1) + 0 = 1` .

Therefore, we have shown that the gradient of the curve at point A is 1.
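For a final check, here is a short SymPy sketch (SymPy assumed available) that sets a = b + 1 and evaluates the derivative at x = b + 1; the result is 1 for any b:

```python
import sympy as sp

x, b = sp.symbols('x b', real=True)
a = b + 1                              # relationship found above
y = (x - a) * sp.sqrt(x - b)

gradient_at_A = sp.diff(y, x).subs(x, b + 1)
print(sp.simplify(gradient_at_A))      # 1
```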
