Infrared imaging is a method of introscopy(*) which makes it possible to visualize objects invisible to the naked eye with the aid of infrared rays reflected or emitted by such objects. This is a common technique used in studying various substances, in flaw detection (e.g. in the construction industry) and in controlling all kinds of technological processes.
The first infrared imagers appeared in 1941. By present-day standards they were quite primitive in design and capabilities. It was only in 1975 that it became possible to build the direct precursors of today's devices (with photodetectors as the centerpiece) on the basis of semiconductor materials synthesized by French scientists. Although much has been done to improve such semiconductors ever since, they no longer meet the current standards of infrared imaging.
Our research scientists from the Joint Institute of Semiconductor Physics (RAS Siberian Branch, Novosibirsk) have come up with an adequate technology. They have designed a new material and equipment for growing CMT (cadmium/mercury/tellurium) films on gallium arsenide substrates by the method of molecular-beam epitaxy(*), which experts assess as a real breakthrough. Our scientists began this research back in 1979, and the first practical results were already on hand in 1986, when the Angara and Katun units for the production of superfine films were put into operation. Experts from other arms of the Siberian Branch of the Russian Academy of Sciences, namely the G. Budker Institute of Nuclear Physics, the Institute of Applied Microelectronics and its pilot plant, likewise assisted the "birth" of this innovative technology.
At first, about fifteen years ago or so, the use of molecular-beam epitaxy for making CMT films was questioned, for the growth rates were then only about 1 µm/h. For photodetectors, however, the material should be 10-12 µm thick. So growing a film with the desired parameters took about 24 hours (preliminary oper ...
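A rough back-of-the-envelope check (the thickness and growth-rate figures are those quoted above; the calculation itself is not from the original text): at about 1 µm/h, the deposition alone accounts for

$$ t_{\text{growth}} \approx \frac{d}{v} \approx \frac{10\text{--}12\ \mu\text{m}}{1\ \mu\text{m/h}} \approx 10\text{--}12\ \text{h}, $$

so with the preparatory operations added, a full cycle of roughly a day is consistent with the 24-hour figure cited.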