Ashok Lahiri, Enovix co-founder and CTO, authored an article, “A New and Innovative 3D Architecture for Lithium Batteries”, that appears in the latest issue of Applied Wireless Technology (you can read the complete article beginning on page 12 of the issue). Following is an excerpt from the article, and I’ll post a few more over the coming weeks.


The Technology Path to Modern Mobile Devices

A modern compact, mobile device—such as a smartphone, tablet, or smartwatch—is only feasible because of performance advancements in integrated circuits, LED lighting, and LCD video displays. Each of these critical components was transformed from its predecessor—vacuum tube, incandescent light bulb, and cathode ray tube, respectively—through photolithography and wafer production techniques.

The first programmable electronic digital computer, Colossus, used in 1943 to help break the German Lorenz cipher, employed vacuum-tube technology. Vacuum-tube computers were very large and, because of their massive heat generation, required a temperature-controlled environment. Vacuum tubes also had a very short mean time to failure; on average, a failed tube had to be replaced every couple of days.

Development of a solid-state transistor to replace the vacuum tube began in 1945. Photolithography was first used on silicon wafers to produce transistors in 1955. By 1960, the technique was being used to commercialize integrated circuits (ICs), which contain thousands or millions of transistors. Photolithography and wafer production have helped drive an exponential increase in the transistor density of ICs and a corresponding decrease in cost. This has enabled rapid advancement in computing capability on ever-smaller platforms, from the first mainframe computers through desktops and laptops to smartphones, tablets, and wearable devices.

Development of an incandescent light bulb dates from around 1802. Commercialization of incandescent bulbs began around 1880. By 1964, improvements in efficiency and production of incandescent lamps had reduced the cost of providing a given quantity of light by a factor of thirty, and bulbs were used to light homes, offices, and streets throughout the developed world. However, a typical incandescent bulb converts 95% or more of the power it consumes into heat rather than visible light, making it relatively inefficient.

The first visible light-emitting diode (LED) was invented in 1962—first in red, then in green and yellow. Initially, LEDs were very costly, but photolithography and wafer production drove costs down, making them a popular choice for many low-power displays in the 1980s. In the 1990s, blue and white LEDs were invented.

As efficiency increased and production costs continued to decrease, LEDs began to replace incandescent bulbs in a range of signal display applications, including traffic lights and auto brake lights. By 2012, a 10W LED emitted light equivalent to a 60W incandescent bulb, and total cost of ownership over the life of the LED was about 80% less than that of a bulb. Today, LED lighting is rapidly replacing incandescent bulbs in many light-intensive applications, including flashlight and flash photography applications in mobile devices.
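The roughly 80% cost-of-ownership advantage can be sanity-checked with a back-of-envelope calculation. The 10W-vs-60W equivalence comes from the article; the lifetimes, electricity price, and purchase prices below are illustrative assumptions (typical circa-2012 figures), not values from the article.

```python
# Back-of-envelope check of the ~80% total-cost-of-ownership claim.
LED_WATTS = 10        # from the article: a 10W LED...
INC_WATTS = 60        # ...matches a 60W incandescent bulb
LED_LIFE_HOURS = 25_000   # assumed typical LED lifetime
INC_LIFE_HOURS = 1_000    # assumed typical incandescent lifetime
PRICE_PER_KWH = 0.12      # assumed electricity price, USD
LED_UNIT_COST = 10.00     # assumed purchase prices, USD
INC_UNIT_COST = 1.00

def total_cost(watts, unit_cost, life_hours, hours=LED_LIFE_HOURS):
    """Purchase cost (including replacements) plus energy cost over `hours`."""
    bulbs_needed = -(-hours // life_hours)   # ceiling division
    energy_kwh = watts * hours / 1000
    return bulbs_needed * unit_cost + energy_kwh * PRICE_PER_KWH

led = total_cost(LED_WATTS, LED_UNIT_COST, LED_LIFE_HOURS)
inc = total_cost(INC_WATTS, INC_UNIT_COST, INC_LIFE_HOURS)
savings = 1 - led / inc
print(f"LED: ${led:.2f}, incandescent: ${inc:.2f}, savings: {savings:.0%}")
# With these assumed inputs: LED $40.00 vs. incandescent $205.00, ~80% savings
```

Under these assumptions the savings over one LED lifetime work out to roughly 80%, consistent with the article's figure; different price assumptions shift the exact percentage but not the order of magnitude.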

The first cathode ray tube (CRT) scanning device was invented by Karl Ferdinand Braun in 1897. The CRT display was essential to the first electronic television (TV), which was produced in the 1930s. Commercialization began in the 1940s, and large-scale adoption of TVs with CRT displays occurred after World War II.

A liquid crystal display (LCD) is made using photolithography and wafer production processes. LCDs were initially commercialized in the 1970s for pocket calculators and digital watches. Monochrome LCDs enabled the production of early portable computers in the 1980s. Throughout the 1980s and 1990s, LCD technology continued to improve as the displays gained more contrast, better viewing angles, and advanced color capabilities. By the late 1990s, quality had improved and prices had declined to a point where 19- and 21-inch color LCD displays began replacing CRT monitors for desktop computing applications, especially for graphic design and desktop publishing. Today, ultrahigh-resolution LCD displays are essential to smartphones, tablets, and smartwatches.

Vacuum tubes, incandescent bulbs, and cathode ray tubes were all initially produced using complex industrial processes—not unlike today's lithium-ion (Li-ion) battery. And each eventually became a performance impediment for its primary application. Modern photolithography and wafer production techniques were used to manufacture each replacement device—IC, LED, and LCD display, respectively. In fact, the only major component in a modern mobile device that is not produced using photolithography and wafer production techniques is the Li-ion battery.