Friday, September 19, 2014

Chipworks iPhone 6 Plus Teardown Finds Sony Sensors in Front and Rear Cameras

Chipworks is quick to publish reverse engineering pictures of the iPhone 6 and 6 Plus: "The iPhone 6 Plus iSight camera chip is housed in a camera module measuring 10.6 mm x 9.3 mm x 5.6 mm thick. Fabricated by Sony, the iSight camera chip is a stacked (Exmor RS), back-illuminated CMOS image sensor (CIS) featuring 1.5 µm generation pixels (introduced for the iPhone 5s). The die size is 4.8 mm x 6.1 mm (29.3 mm²). The phase pixel pairs have all been implemented in the green channel and cover the majority of the active pixel array."

"Our speculation of Sony winning the FaceTime sockets, though, turned out to be correct. We’ve just confirmed the iPhone 6 Plus FaceTime camera is a stacked Sony CIS and will provide more details in a future update."

Chipworks publishes a few pictures of the iPhone 6 Plus rear camera:

Sony Announces SmartEyeglass

Sony releases a development prototype of SmartEyeglass, a Google Glass competitor. The SmartEyeglass is equipped with a 3MP camera capable of VGA video. A high-capacity battery is located in a separate controller box connected to the glasses by a wire:


Sony plans to offer SmartEyeglass for sale to developers by the end of FY2014 (March 2015), with the intention of further promoting application development and accelerating the commercialization of the product for consumer use. A YouTube video shows the prototype glasses in action:

Vision Award Shortlist

IMV Europe publishes the award shortlist for the Vision Show 2014 in Stuttgart, Germany. The only image sensor company on the list is Odos Imaging, with its high-resolution ToF cameras. The Real.iZ-1K (1.3MP) is the first system, released in 2014, while the higher resolution Real.iZ-4K (4.2MP) is to be released in 2015. Each and every pixel can be used to measure both ambient light and range, allowing the systems to generate separate range and intensity images of the scene. The cameras include all the features of a conventional machine vision camera, with the additional benefit of per-pixel range measurements.

Thursday, September 18, 2014

Image Sensors at IEDM 2014

IEDM publishes its 2014 agenda with sessions 4 and 10 having many image sensor papers:

4.1 MOS Capacitor Deep Trench Isolation for CMOS Image Sensors
N. Ahmed, F. Roy, G-N. Lu*, B. Mamdy, J-P. Carrere, A. Tournier, N. Virollet, C. Perrot, M. Rivoire, A. Seignard**, D. Pellissier-Tanon, F. Leverd and B. Orlando, STMicroelectronics, *CNRS, **CEA-LETI

This paper proposes the integration of MOS Capacitor Deep Trench Isolation (CDTI) as a solution to boost image sensor pixel performance. We have investigated CDTI and compared it to oxide-filled Deep Trench Isolation (DTI) configurations on silicon samples, with a fabrication based on TCAD simulations. The measurements on CDTI without sidewall implantation exhibit very low dark current (~1 aA at 60°C for a 1.4 µm pixel), high full-well capacity (~12,000 e-), and improved quantum efficiency compared to the DTI configuration.

4.2 Three-Dimensional Integrated CMOS Image Sensors with Pixel-Parallel A/D Converters Fabricated by Direct Bonding of SOI Layers
M. Goto, K. Hagiwara, Y. Iguchi, H. Ohtake, T. Saraya*, M. Kobayashi*, E. Higurashi*, H. Toshiyoshi* and T. Hiramoto*, NHK Science and Technology Research Laboratories, *The University of Tokyo

We report the first demonstration of three-dimensional integrated CMOS image sensors with pixel-parallel A/D converters. Photodiode and inverter layers were directly bonded to provide each pixel with in-pixel A/D conversion. The developed sensor successfully captured images and confirmed excellent linearity with a wide dynamic range of more than 80 dB.

4.3 High Sensitivity Image Sensor Overlaid with Thin-Film Crystalline-Selenium-based Heterojunction Photodiode
S. Imura, K. Kikuchi, K. Miyakawa, H. Ohtake, M. Kubota, T. Okino*, Y. Hirose*, Y. Kato* and N. Teranishi**, NHK Science and Technology Research Laboratories, *Panasonic Corporation, **University of Hyogo

We developed a stacked image sensor on the basis of a thin-film crystalline-selenium (c-Se) heterojunction photodiode. Tellurium-diffused crystallization for producing uniform c-Se films was used to fabricate c-Se-based photodiodes laminated on complementary metal-oxide-semiconductor (CMOS) circuits, and we present herein the first high-resolution images obtained with such devices.

4.4 9.74-THz Electronic Far-Infrared Detection Using Schottky Barrier Diodes in CMOS
Z. Ahmad, A. Lisauskas*, H.G. Roskos* and K.K. O, University of Texas at Dallas, *JWG University

9.74-THz fundamental electronic detection of Far-Infrared (FIR) radiation is demonstrated. The detection, along with that at 4.92 THz, was realized using Schottky-barrier diode detection structures formed without any process modifications in CMOS. Peak optical responsivities (Rv) of 383 and ~14 V/W have been measured at 4.92 and 9.74 THz, respectively. The Rv at 9.74 THz is 14X that of the previously reported highest frequency electronic detection. The shot-noise-limited NEP at 4.92 and 9.74 THz is ~0.43 and ~2 nW/√Hz, respectively.

4.5 Experimental Demonstration of a Stacked SOI Multiband Charge-Coupled Device
C.-E. Chang, J. Segal*, A. Roodman*, C. Kenney* and R. Howe, Stanford University, *SLAC National Accelerator Laboratory

Multiband light absorption and charge extraction in a stacked SOI multiband CCD are experimentally demonstrated for the first time. This proof of concept is a key step in the realization of the technology which promises multiple-fold efficiency improvements in color imaging over current filter- and prism-based approaches.

4.6 Enhanced Time Delay Integration Imaging using Embedded CCD in CMOS Technology
P. De Moor, J. Robbelein, L. Haspeslagh, P. Boulenc, A. Ercan, K. Minoglou, A. Lauwers, K. De Munck and M. Rosmeulen, IMEC

Imec developed a new imager platform enabling the monolithic integration of 130 nm CMOS/CIS with charge coupled devices (CCD). The process module was successfully developed and the potential of this embedded CCD in CMOS (eCCD) was demonstrated with the fabrication of a time delay integration (TDI) imager.

10.1 Jot Devices and the Quanta Image Sensor (Invited)
J. Ma, D. Hondongwa and E. Fossum, Thayer School of Engineering at Dartmouth

The Quanta Image Sensor (QIS) concept and recent work on its associated jot device are discussed. A bipolar jot and a pump gate jot are described. Both have been modelled in TCAD. The pump gate jot features a full well of 200 e- and conversion gain exceeding 300 µV/e-.

10.2 SPAD Based Image Sensors
E. Charbon, Senior Member IEEE

The recent availability of miniaturized photon-counting pixels in standard CMOS processes has paved the way to the introduction of photon counting in low-cost time-of-flight cameras, robotics vision, mobile phones, and consumer electronics. In this paper we describe the technology at the core of this revolution: single-photon avalanche diodes (SPADs) and the architectures enabling SPAD-based image sensors. We discuss tradeoffs and design trends, often referring to specific sensor chips and applications.

10.3 Toward 1Gfps: Evolution of Ultra-high-speed Image Sensors: ISIS, BSI, Multi-Collection Gates, and 3D-stacking
T.G. Etoh, V.T.S. Dao, K. Shimonomura, E. Charbon, C. Zhang*, Y. Kamakura and T. Matsuoka**, Ritsumeikan University, *Technical University of Delft, **Osaka University

Evolution of ultra-high-speed image sensors toward 1 Giga fps is presented with innovative technology to achieve the frame rate. The current highest frame rate is 16.7Mfps. A new sensor structure and a new driver circuit are proposed. Simulations prove that they further reduce the frame interval to 1ns.

10.4 Imaging with Organic and Hybrid Photodetectors (Invited)
S. Tedde, P. Buechele, R. Fischer, F. Steinbacher, O. Schmidt, Siemens AG

10.5 A CMOS-compatible, Integrated Approach to Hyper- and Multispectral Imaging
A. Lambrechts, P. Gonzalez, B. Geelen, P. Soussan, K. Tack and M. Jayapala, Imec

Imec has developed a process for the monolithic integration of optical filters on top of the CMOS imager sensors, leading to compact, cost-efficient and faster hyperspectral cameras with improved performance. To demonstrate the versatility of imec hyperspectral technology, prototype sensors with different filter arrangements and performance have been successfully fabricated.

10.6 Image Sensors for High-throughput, Massively-parallel DNA Sequencing: Requirements and Roadmap
A. Grot, Pacific Biosciences

The cost of DNA sequencing has dropped significantly over the last decade, due in part to advances in high performance CCD and CMOS image sensors. Key performance specifications – such as resolution, sensitivity, and frame-rate, along with the performance improvements necessary for continued cost reduction – will be discussed.

10.7 High Performance Silicon Imaging Arrays for . . . (the title appears to be cut off in the agenda)

10.8 Detecting elementary particles using Hybrid Pixel Detectors at the LHC and beyond
M. Campbell, CERN

On July 4th, 2012, CERN announced the discovery of the Higgs boson at the Large Hadron Collider. Englert and Higgs were awarded the Nobel Prize in Physics in 2013 for postulating the existence of the boson, along with Brout (now deceased), in 1964. The discovery was made possible by the combination of a machine capable of accelerating protons to unprecedented energies and two huge detectors, called ATLAS and CMS, able to record unambiguously the energy and location of the particle tracks produced by the collisions. Every 50 ns, bunches of protons are made to collide in the heart of the giant experiments, and around 20-30 proton interactions take place, generating thousands of debris particles. In searching for the Higgs boson, the particles participating in a given interaction need to be detected and tagged to a given bunch crossing (BCO). The innermost regions of the experiments are equipped with hybrid pixel detectors. This paper will provide a brief overview of the large-scale hybrid pixel detector systems used at the LHC experiments. It will also describe how the same hybrid pixel detector approach is used in applications beyond high energy particle physics.

26.3 High Performance Metal Oxide TFT and its Applications for Thin Film Electronics
G. Yu, C.-L. Shieh, J. Musolf, F. Foong, T. Xiao, G. Wang, K. Ottosson, CBRITE Inc.

Recent progress on metal-oxide TFT with mobility and stability as good as LTPS-TFT and with uniformity and off current as good as pristine a-Si TFT will be presented. Their applications for high pixel density displays and image arrays are discussed with emphasis on pixel and peripheral circuits with analog functions.

The conference press kit shows a preview of NHK paper #4.2 3D "Pixel-Parallel" Image Processing:


"The resolutions and frame rates of CMOS image sensors have increased greatly to meet demands for higher-definition video systems, but their design may soon be obsolete. That’s because photodetectors and signal processors lie in the same plane, on the substrate, and many pixels must time-share a signal processor. That makes it difficult to improve signal processing speed. NHK researchers developed a 3D parallel-processing architecture they call “pixel-parallel” processing, where each pixel has its own signal processor. Photodetectors and signal processors are built in different vertically stacked layers. The signal from each pixel is vertically transferred and processed in individual stacks. 3D stacking doesn’t degrade spatial resolution, so both high resolution and a high frame rate are achieved. 3D stacked image sensors have been reported previously, but they either didn’t have a signal processor in each stack or they used TSV/microbump technology, reducing resolution. NHK will discuss how photodiode and inverter layers were bonded with damascened gold electrodes to provide each pixel with analog-to-digital conversion and a pulse frequency output. A 64-pixel prototype sensor was built, which successfully captured video images and had a wide dynamic range of >80 dB, with the potential to be increased to >100 dB."

Hua Capital Hires Bank of America to Fund Omnivision Bid

Bloomberg: Hua Capital Management Ltd., a Beijing-based private equity firm, hired Bank of America Corp. to provide funding for its $1.7b bid for Omnivision. Steven Zhang, president of Hua Capital, declined to comment on specifics of the deal, including how much funding Bank of America will provide.

Hua Capital was chosen in June to manage the chip design and testing fund under the Beijing government’s 30 billion-yuan ($4.9b) Semiconductor Industry Development Fund. The Semiconductor Industry Development Fund was set up in December last year to help finance China’s chip industry growth and assist with mergers and acquisitions.

Mantis Vision and Flextronics Present Tablet with 3D Camera

Cnet, NY1, Tom's Guide: Mantis Vision and Flextronics announce their collaboration and development of the OEM-ready 3D-enabled tablet specifically designed for Dynamic 3D Content Creation, called Aquila. Aquila is an 8” tablet featuring Mantis Vision’s MV4D core 3D engine, MV4D Camera Control SDK, and depth sensing components for 3D data acquisition.

“At Mantis Vision, we are ecstatic to be such an integral part of Aquila,” said Amihai Loven, CEO, Mantis Vision. “Aquila will be the first tool of its kind for content creators and a variety of commercial and vertical market applications. Because it is available to all developers and OEMs, makers will have unbounded access to a brave new 3D content ecosystem. Along with Flextronics, we are ready to reinvent the 3D experience for everyone, from creators to consumers.”

The tablet is aimed at developers working on 3D imaging applications.

Wednesday, September 17, 2014

MIPI Alliance Officially Releases C-PHY v1.0, D-PHY v1.2, and M-PHY v3.1 Specs

Business Wire: MIPI Alliance introduces the new C-PHY spec, a physical layer interface for camera and display applications. "The MIPI C-PHY specification was developed to reduce the interface signaling rate to enable a wide range of high-performance and cost-optimized applications, such as very low-cost, low-resolution image sensors; sensors offering up to 60 megapixels; and even 4K display panels," said Rick Wietfeldt, chair of the MIPI Alliance Technical Steering Group.

MIPI C-PHY departs from conventional differential signaling on two-wire lanes and introduces 3-phase symbol encoding of about 2.28 bits per symbol to transmit data on 3-wire lanes, or “trios,” where each trio includes an embedded clock. Three trios operating at the C-PHY v1.0 rate of 2.5 Gsym/s achieve a peak bandwidth of 3 x 2.5 Gsym/s x 2.28 bits/symbol, or about 17.1 Gbps, over a 9-wire interface that can be shared, if desired, with the MIPI D-PHY interface.
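The 17.1 Gbps figure is easy to verify: C-PHY's ~2.28 bits per symbol comes from mapping 16-bit data words onto 7 three-phase symbols (16/7 ≈ 2.286). A minimal sketch of the arithmetic (the function name and structure are illustrative, not from the spec):

```python
# C-PHY maps 16-bit data words onto 7 three-phase symbols,
# which is where the ~2.28 bits/symbol figure comes from.
BITS_PER_SYMBOL = 16 / 7  # ≈ 2.286

def cphy_peak_gbps(symbol_rate_gsps: float, trios: int = 3) -> float:
    """Aggregate peak bandwidth of a C-PHY link in Gbps (illustrative helper)."""
    return trios * symbol_rate_gsps * BITS_PER_SYMBOL

# C-PHY v1.0: three trios (9 wires) at 2.5 Gsym/s
print(round(cphy_peak_gbps(2.5), 1))  # → 17.1
```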

The MIPI Alliance also announces updates to the MIPI D-PHY and MIPI M-PHY physical layer technologies. The updated MIPI D-PHY specification, v1.2, introduces lane-based data skew control in the receiver to achieve a peak transmission rate of 2.5 Gbps/lane or 10 Gbps over 4 lanes, compared to the v1.1 peak transmission rate of 1.5 Gbps/lane or 6 Gbps over 4 lanes. The MIPI M-PHY v3.1 specification introduces transmitter equalization to improve support for challenging channels while maintaining the peak transmission rate of 5.8 Gbps/lane or 23.2 Gbps over 4 lanes, which was achieved in its v3.0 specification.
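Unlike C-PHY, the D-PHY and M-PHY links use plain differential lanes, so their aggregate throughput is simply the per-lane rate times the lane count. A quick check of the figures quoted above (the helper function is illustrative):

```python
def aggregate_gbps(rate_per_lane_gbps: float, lanes: int) -> float:
    """Aggregate throughput of a multi-lane differential PHY (illustrative helper)."""
    return rate_per_lane_gbps * lanes

print(aggregate_gbps(1.5, 4))  # D-PHY v1.1  → 6.0 Gbps
print(aggregate_gbps(2.5, 4))  # D-PHY v1.2  → 10.0 Gbps
print(aggregate_gbps(5.8, 4))  # M-PHY v3.x  → 23.2 Gbps
```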

Synopsys MIPI D-PHY Cuts Area and Power by 50%

PR Newswire: Synopsys says its new DesignWare MIPI D-PHY is 50% lower in area and power compared to competitive solutions. The new IP is the first in the industry compliant to the MIPI D-PHY v1.2 spec (8 data lanes maximum instead of 4 lanes in v1.1), and delivers aggregated data throughput of up to 20 Gbps for high-resolution imaging (2.5 Gbps per lane, 8 lanes).

The new DesignWare MIPI D-PHY is available now in 16-nm FinFET processes, with availability in 28-nm processes scheduled for early 2015. VIP for MIPI D-PHY v1.2 is available now.

"By delivering an extremely small-area and low-power D-PHY to the fast-paced and competitive mobile market, Synopsys helps designers differentiate their SoCs in both silicon cost and battery life," said John Koeter, VP of marketing for IP and prototyping at Synopsys.

"The DesignWare MIPI D-PHY offered low power consumption, high performance and configurability options that were critical to the success of our Myriad 2 Vision Processing Unit," said Sean Mitchell, SVP and COO at Movidius. As a side note, Myriad 2 has 12 lanes of 1.5 Gbps D-PHY in a 28nm process, so I'm not sure it's relevant to this recent announcement.

Samsung APS-C BSI Sensor

Business Wire, Samsung Tomorrow: Samsung officially announces its 28MP APS-C sensor. Currently in mass production, the new S5KVB2 uses a 65nm copper process, while most other large sensors rely on 0.18um aluminum technology. The 65nm process enables lower power consumption, less heating, and lower noise.

The pixel size is 3.6um. Samsung says that BSI technology "improves the light sensitivity of each pixel and increases light absorption in peripheral areas by approximately 30 percent, resulting in crisper, sharper images compared to a conventional front-side illumination (FSI) pixel-based imager."

“To satisfy the increasing market need for high-end image sensors in digital cameras, Samsung has introduced this new imager, which features higher resolution, superior image quality, and faster shooting speed with low power consumption,” said Kyushik Hong, VP of System LSI marketing, Samsung Electronics. “Based on its leadership in CMOS imaging technologies, Samsung will continue to address new trends in camera sensor markets.”

Tuesday, September 16, 2014

Sony Presents 4D AF

Sony presents 4D Autofocus, which the company calls "the beginning of a new era." YouTube videos show its advantages from a user perspective (video #1, video #2):