iPhone 18: Samsung to manufacture the image sensors for Apple

Apple is touting chips from Samsung's US factory for upcoming products. Apparently, these are stacked image sensors.


Apple has grandly announced "an innovative new chip manufacturing technology" that "has never been used before in the world." Samsung's chip manufacturing division in Austin, Texas, will produce these chips for Apple. They are intended to "optimize the performance and energy efficiency of Apple products, including iPhone devices shipped worldwide." The announcement is part of Apple's plan to invest 600 billion US dollars in the USA.

Apple is not specifying which manufacturing technology or chip type is involved. On first reading, this could sound like a state-of-the-art manufacturing process, for example one using extreme ultraviolet lithography with high numerical aperture (High-NA EUV). However, according to consistent reports, including from the Financial Times, this is not about processors but about image sensors, among other things for the iPhone 18 generation. According to these reports, the sensors will not be ready in time for the 17 series.

"New technology" does not refer to particularly fine chip structures, but to the sensor design with multiple chip levels. It is fitting that the chips are to be produced in Samsung's older Austin semiconductor plant. Samsung plans to produce new manufacturing processes in neighboring Taylor, where a new plant is being built. However, image sensors do not require particularly fine structures.


Image sensors can be stacked in various ways. Apple's previous supplier Sony, for example, uses up to three sensor layers. In these three-layer image sensors, the photodiodes that capture the incoming light sit on the top layer. Below that is a second layer with the pixel transistors that store the charge. The third layer contains the logic circuits that convert the stored charges into image information. Sony fabricates each of the three layers on a separate silicon wafer and then bonds them into a stack.
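The three-layer division of labor described above can be sketched as a simple pipeline. This is purely an illustrative model, not Sony's actual circuit design; the numbers (full-well capacity, 8-bit output) are assumptions for the sketch.

```python
import random

def photodiode_layer(scene_brightness, n_pixels):
    """Top layer: photodiodes convert incident light into per-pixel charge
    (with a little random variation to stand in for photon noise)."""
    return [scene_brightness * random.uniform(0.9, 1.1) for _ in range(n_pixels)]

def transistor_layer(charges):
    """Middle layer: pixel transistors store the charge, clipped at a
    hypothetical full-well capacity."""
    full_well = 100.0
    return [min(c, full_well) for c in charges]

def logic_layer(stored):
    """Bottom layer: logic circuits digitize the stored charge into
    8-bit image values."""
    return [round(c / 100.0 * 255) for c in stored]

pixels = logic_layer(transistor_layer(photodiode_layer(80.0, 4)))
```

Stacking the stages vertically instead of placing them side by side frees up the top layer entirely for light capture.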

Another approach is to split the required red, green and blue pixels across different layers to significantly increase the light yield. In conventional image sensors, the corresponding photodiodes sit next to each other, each behind a color filter, so that every pixel captures only about a third of the incident light. Optimizing the light yield would explain Apple's promise of improved energy efficiency.


(mma)


This article was originally published in German. It was translated with technical assistance and editorially reviewed before publication.