Selecting a laser for Raman spectroscopy – important performance parameters to consider

A wide range of illumination wavelengths is commonly used in Raman spectroscopy, from the ultraviolet (UV) through the visible and into the near-infrared (near-IR). Choosing the best illumination wavelength for a given application is not always obvious. Many system variables must be considered to optimize a Raman spectroscopy experiment, and several of them are tied to the choice of wavelength.

Why is the choice of laser wavelength important for Raman spectroscopy?

To start with, the Raman signal is inherently very weak. It relies on the photon-phonon interaction in the sample material, which is roughly a one-in-a-million event. In addition, the Raman scattering intensity is inversely proportional to the fourth power of the illumination wavelength, so illumination at longer wavelengths produces a weaker Raman signal.
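As a quick illustration of this 1/λ⁴ scaling, the relative signal loss when moving from a common green laser line to 785 nm can be estimated as follows (the function name and the 532 nm reference are illustrative choices, not part of any library):

```python
def relative_raman_intensity(wavelength_nm: float, reference_nm: float = 532.0) -> float:
    """Raman scattering intensity at wavelength_nm relative to reference_nm,
    using the I ~ 1/lambda^4 scaling (illustrative helper, not a library API)."""
    return (reference_nm / wavelength_nm) ** 4

# Moving from 532 nm to 785 nm excitation cuts the Raman signal
# to roughly 21% of its value, i.e. almost a factor of 5 weaker.
print(round(relative_raman_intensity(785.0), 2))  # -> 0.21
```

This factor-of-five penalty is one side of the trade-off discussed below; the other side is fluorescence.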

The detector sensitivity is also wavelength dependent. CCDs are commonly used to detect the Raman signal, but their quantum efficiency rolls off fairly quickly beyond 800 nm. For illumination beyond 800 nm it is possible to use InGaAs array detectors instead, but these come with higher noise levels, lower sensitivity, and higher cost.

The wavelength dependence of both the Raman signal strength and the detection sensitivity seems to point toward shorter-wavelength illumination (UV and visible) rather than longer wavelengths (near-IR). However, shorter-wavelength illumination brings a challenge of its own: fluorescence. Many materials fluoresce when excited with UV or visible light, and this emission can swamp the weak Raman signal.

Even so, the most commonly used wavelength in Raman spectroscopy is 785 nm. It offers a practical compromise: fluorescence is greatly reduced compared with visible excitation, while the Raman signal remains usable and much of the scattered light still falls within the sensitive range of a CCD.
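To see why detector coverage matters, the absolute wavelength of a Stokes-shifted Raman line can be computed from the excitation wavelength and the Raman shift in wavenumbers. A minimal sketch (the function name is illustrative):

```python
def stokes_wavelength_nm(excitation_nm: float, shift_cm1: float) -> float:
    """Absolute wavelength (nm) of a Stokes Raman line for a given
    excitation wavelength (nm) and Raman shift (cm^-1)."""
    excitation_cm1 = 1e7 / excitation_nm        # convert nm to wavenumbers
    return 1e7 / (excitation_cm1 - shift_cm1)   # subtract the shift, convert back

# A 3000 cm^-1 band (e.g. the C-H stretch region):
print(round(stokes_wavelength_nm(532.0, 3000.0), 1))  # -> 633.0 (well within CCD range)
print(round(stokes_wavelength_nm(785.0, 3000.0), 1))  # -> 1026.8 (past the CCD roll-off)
```

With 785 nm excitation, high-wavenumber bands already land beyond 1000 nm, which is why detector choice and excitation wavelength must be considered together.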