# A light source contains two wavelengths of light, λ_1=450 nm and λ_2=650 nm. It is incident on a diffraction grating with 1000 slits/mm and the interference pattern is being observed on a screen...

A light source contains two wavelengths of light, λ_1=450 nm and λ_2=650 nm. It is incident on a diffraction grating with 1000 slits/mm, and the interference pattern is observed on a screen 1.5 m away from the grating. What is the separation between the 1st order bright fringes of the two wavelengths of light?


### 1 Answer

The figure is below. The condition for an interference maximum at angle `theta` for a given wavelength `lambda` is

`d*sin(theta_k) = k*lambda` (1)

where `k = 0, 1, 2, ...` is the order of the maximum and `d = 1/n` is the distance between two adjacent slits on the grating; for `n = 1000` slits/mm, `d = 10^-3 mm = 1000 nm`.

For a screen situated at the distance `l` from diffraction grating, the maxima occur at height `D_k` on the screen. From the figure one can write

`tan(theta_k) =D_k/l` (2)

For small angles `theta_k` one can write

`tan(theta_k) ~~ sin(theta_k)`

(here `sin(theta_1)` equals 0.45 and 0.65 for the two wavelengths, so this approximation is only rough)

and thus from (1) and (2) we get

`d*D_k/l =k*lambda => D_k= (l*k*lambda)/d`
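The formula above can be sketched as a short script; the function name and the use of SI units throughout are my own choices for clarity, not part of the original answer:

```python
# Small-angle fringe position D_k = l * k * lambda / d, all quantities in SI units.
def fringe_position(l, k, wavelength, d):
    """Height on the screen of the k-th order maximum (small-angle approximation)."""
    return l * k * wavelength / d

d = 1e-3 / 1000   # slit spacing: 1/1000 mm = 1e-6 m
l = 1.5           # grating-to-screen distance in m

print(fringe_position(l, 1, 450e-9, d))  # 0.675
print(fringe_position(l, 1, 650e-9, d))  # 0.975
```

Note that `lambda/d` is dimensionless, so `D_k` comes out in the same units as `l` (meters here).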

For `lambda = 450 nm = 450*10^-6 mm` and `k=1` we obtain (since `lambda/d` is dimensionless, `D_1` carries the units of `l`, i.e. meters)

`D_1 = 1.5*1*450*10^-6*1000 = 0.675 m`

For `lambda' = 650 nm = 650*10^-6 mm` and `k=1` we obtain

`D_1' = 1.5*650*10^-6*1000 = 0.975 m`

Therefore the separation on the screen between the first-order maxima is

`Delta(D) = 0.975 - 0.675 = 0.3 m`

**In the small-angle approximation, the separation between the first order bright fringes on the screen is 0.3 m (30 cm).**
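Because `sin(theta_1)` is 0.45 and 0.65 here, the small-angle step is only a rough shortcut. As a cross-check (a sketch of my own, not part of the original answer), one can evaluate the exact geometry `theta_k = arcsin(k*lambda/d)`, `D_k = l*tan(theta_k)`:

```python
import math

# Exact fringe position: theta_k = arcsin(k*lambda/d), D_k = l*tan(theta_k).
def exact_position(l, k, wavelength, d):
    theta = math.asin(k * wavelength / d)
    return l * math.tan(theta)

d, l = 1e-6, 1.5                       # slit spacing (m), screen distance (m)
D1 = exact_position(l, 1, 450e-9, d)   # ~0.756 m
D2 = exact_position(l, 1, 650e-9, d)   # ~1.283 m
print(D2 - D1)                         # ~0.53 m
```

The exact separation is roughly 0.53 m, noticeably larger than the 0.3 m given by the small-angle approximation, which illustrates how rough that approximation is at these angles.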
