Akihabara News and Electronista report that Sony announced 12MP, 8MP and 5MP sensors and camera modules featuring the new 1.4um pixels. As far as I was able to understand the original Sony Japan announcement, translated by Google, the 1/2.5-inch IMX060PQ 12.25MP and 1/4-inch IMX045PQ 5.15MP sensors start sampling in March 2009, while 1/3.2-inch IMX046PQ 8.11MP samples are available as of November 2008, that is, now. IU060F 12.25MP camera module sampling begins in September 2009, while IU046F 8.11MP module sampling starts in February 2009. The modules have piezoelectric AF, an F2.8 lens and 28mm and 32mm equivalent focal lengths for the 12MP and 8MP models, respectively.
Regarding the technology, Sony only mentions the Cu wiring and column-parallel ADCs used in the sensors. The 12MP sensor's speed is 10fps at full resolution, while the 5MP and 8MP ones run at 15fps at full resolution. All sensors have 1080p/30fps and 720p/30fps HD video modes (except the 12MP one, which only reaches 27fps in 1080p mode, and the 5MP one, which offers 60fps in 720p mode). The new 1.4um pixel's sensitivity is said to be equal to that of the 1.75um pixel.
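If the sensitivity claim is taken literally, it implies the 1.4um pixel fully compensates for its smaller area. A back-of-the-envelope check (my own arithmetic, not from the announcement):

    # What does "1.4um sensitivity equals 1.75um" imply? (my arithmetic, not Sony's)
    old_pitch_um = 1.75
    new_pitch_um = 1.40
    area_ratio = (old_pitch_um / new_pitch_um) ** 2
    print(round(area_ratio, 2))  # ~1.56: the 1.4um pixel has ~36% less area,
                                 # so per-pixel parity implies ~56% better collection per unit area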
The sensors' interface is 648MHz 2-lane MIPI and sub-LVDS. The supply is a triple combination of 2.7V for analog, 1.2V for digital and 1.8V for the interface (the 1.8V rail is probably needed to support the sub-LVDS voltage levels).
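A rough sanity check of the quoted frame rates against the 2-lane MIPI link, reading 648MHz as 648Mbps per lane and assuming 10-bit RAW output (the bit depth is not stated in the announcement, so treat this as a sketch):

    # Does 12.25MP x 10fps fit into 648Mbps x 2 lanes? (bit depth is my assumption)
    lanes = 2
    lane_rate_gbps = 0.648          # 648MHz MIPI, per lane
    pixels = 12.25e6                # full-resolution pixel count
    fps = 10                        # quoted full-resolution frame rate
    bits_per_pixel = 10             # assumed RAW bit depth, not from the PR

    payload_gbps = pixels * fps * bits_per_pixel / 1e9   # ~1.23 Gbps of image data
    link_gbps = lanes * lane_rate_gbps                    # ~1.30 Gbps of raw link capacity
    print(round(payload_gbps, 2), round(link_gbps, 2))    # payload just fits, before protocol overhead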
Red color reproduction is one of the most difficult problems in small pixel design. Probably because of that, Sony put a magnified inset with red color details in the sample picture from its new 12MP sensor (click on the picture to enlarge):
Update: It looks like the Sony Insider site has quite a complete translation of the Sony Japan PR on the 1.4um generation. It also has a translated table with the sensor parameters:
Update #2: Here is the official Sony PR in English.
Do you know if this is the BSI sensor that they announced a while back? If so, that would explain the red response and QE increase (through the depth of the Si).
This is not BSI. Red is improved probably by a lower metal stack, maybe a lightguide, optimized antireflective layers and some such.
Lower stack, "light guide", and antireflective layers do not really improve red QE from an optics point of view. Only better implant isolation allows it. Or specific post-processing to get "pure" red colors...
I'm afraid most pixel designers would disagree with you. Probably your experience is different from others'. Most companies see stack reduction as the major way to overcome diffraction in red. AR layers reduce reflection, to an extent. True, isolation implants are very effective in crosstalk reduction, but most companies have used them since the 1.75um generation or even before.
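As a crude illustration of how severe diffraction is at these pitches, here is a free-space Airy-spot estimate at the F2.8 aperture quoted for the modules (this ignores the microlens and the in-stack refractive indices, so it only shows the trend):

    # Airy disk diameter d = 2.44 * lambda * F#, compared with the 1.4um pitch
    f_number = 2.8                                   # module lens from the Sony PR
    for name, wavelength_um in (("blue", 0.45), ("green", 0.55), ("red", 0.65)):
        airy_um = 2.44 * wavelength_um * f_number
        print(name, round(airy_um, 1), "um")         # ~3.1, ~3.8, ~4.4 um: all exceed 1.4um, worst in red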
Of course most pixel designers can disagree with me! But recent papers have demonstrated that an optimized antireflective coating improves the response at blue to green wavelengths, but not so much at red wavelengths. It is the same with stack reduction, which substantially improves the blue and green responses. From my point of view, red QE is today an implant problem.
Interesting. Could you point me to the recent papers you mentioned?
Sure: A. Crocherie, J. Vaillant, F. Hirigoyen, "Three-dimensional broadband FDTD optical simulations of CMOS image sensor," SPIE Conference Proceedings, Vol. 7100, 27 September 2008.
Well, this contradicts my common sense. I will look into your article and get back to you.
Any comment is welcome.
F.H.
OK, I've read the paper. First, I'm impressed by the serious, large-scale work done by the ST optical team.
However, the paper is not well organized, probably because it shows only a glimpse of a larger work that has many targets, and because the stack and pixel details were censored.
I agree that the AR layer probably does not affect the red response. I see no mention of light pipe effectiveness in the paper.
One thing I missed in the simulation setup is the diffraction caused by the color filters. Red light coming to the sensor sees a set of red apertures, while the other color filters block it; the same holds for the other colors. I was unable to find this defined in the simulation setup. More than that, part of the simulation-to-silicon comparison uses a transparent layer instead of the color filter. I believe that having the color filter aperture in the simulation is essential.
The paper deals with a 1.75um pixel, where diffraction effects are smaller than in a 1.4um one. Also, the 1.4um pixel starts to exhibit higher-order diffraction effects, so an array of pixels should be simulated to see them.
But overall it's a great job. Especially impressive is the comparison with the real silicon; it rarely fits the simulation that closely. Excellent work!
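For readers unfamiliar with FDTD, below is a minimal 1-D Yee-scheme sketch of the kind of time stepping such papers scale up to full 3-D pixel stacks; it is a toy in normalized units and is not taken from the ST paper:

    import numpy as np

    # Minimal 1-D FDTD (Yee scheme): E and H live on staggered grids and are
    # leap-frogged in time; a Gaussian pulse is injected as a soft source.
    nz, nt = 400, 800
    ez = np.zeros(nz)            # electric field samples
    hy = np.zeros(nz)            # magnetic field samples, offset by half a cell
    s = 0.5                      # Courant number c*dt/dz, must be <= 1 for stability

    for t in range(nt):
        hy[:-1] += s * (ez[1:] - ez[:-1])            # update H from the spatial derivative of E
        ez[1:]  += s * (hy[1:] - hy[:-1])            # update E from the spatial derivative of H
        ez[50]  += np.exp(-((t - 60) / 20.0) ** 2)   # soft Gaussian source at one grid point

    print(float(ez.max()))       # pulse amplitude on the grid after propagation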
Thank you! Well, more information about the pixel and stacks was presented at the conference. The general methodology (simulation setup) has been presented previously: "FDTD-based optical simulations methodology for CMOS image sensors pixels architecture and process optimization" (EI2008 Conference Proceedings Paper). I can assure you we are still improving our modeling, and it is more and more predictive. New papers will come soon!
FH