Maarten Vanden Eynde

Silicon Age (2016), Meessen De Clercq, Brussels, Belgium, 2016 (photo: Philippe de Gobert)

Ever since the digital revolution began, microchips made of silicon have been getting smaller. ‘Moore’s Law’, based on a forecast made by Intel co-founder Gordon E. Moore in 1965, predicted that the number of transistors that could be fitted onto a microchip would double every 18 to 24 months, constantly increasing computer speed and efficiency. By the start of the twenty-first century, the traditional chip circuitry made of silicon had become too microscopic to work reliably. It marked the end of the silicon age.

Over 90% of the Earth’s crust is composed of silicate minerals, making silicon the second most abundant element in the crust after oxygen. It is most widely distributed in dusts, sands, planetoids and planets as various forms of silicon dioxide (silica) or silicates. Silicon is the basic material used in the production of integrated circuits, which in turn are used in computers, televisions, mobile phones and every kind of electronic equipment and semiconductor device. Monocrystalline silicon is also used in large quantities by the photovoltaic industry in the production of conventional solar cells.

'Silicon Age' consists of a pure silicon ingot, or boule, grown by a special process to obtain a 99.99999% pure single crystal. On one side, the image of the first monolithic silicon integrated circuit chip, invented by Robert Noyce in 1961, is engraved as a bas-relief. On the other side, the crystal comes to a natural end, in the physical form of the ingot, at the point where it cannot get any smaller.