With a battle cry of “More Light. Less Noise,” new startup Pelican Imaging has announced its focus on developing a smaller, better camera for smartphones, notebooks, tablets, and other mobile devices. By using an image array sensor, the camera would theoretically be able to take higher-quality photos, and in lower light, than is currently possible.

What the heck is an image array sensor? Think of the compound eye of a common housefly. Viewed close up, the housefly’s eye consists of many small sensors relaying a vast assortment of visual data to the fly’s tiny brain, which compiles it into a single, stitched image that makes sense of what’s being seen. An image array sensor is like a fancy little robot version of this way of seeing the world. Since this differs significantly from the human-eye model common to most existing cameras, image array sensors promise a different — Pelican would say better — capacity for converting visual data into images in a thinner, more compact package. With the processing power present in even the most spartan of modern smartphones at its disposal, there’s no reason for the camera attached to one to mimic traditional means of capturing images.
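To make the housefly analogy concrete, here is a toy sketch of the basic idea: if many small sub-cameras each capture a noisy view of the same scene, averaging the aligned sub-images reduces noise, which is the intuition behind the “More Light. Less Noise” pitch. This is not Pelican’s actual algorithm (real array cameras must also register the sub-images and handle parallax, which this sketch skips); it is a minimal illustration using NumPy.

```python
import numpy as np

def combine_subimages(subimages):
    """Naively fuse pre-aligned sub-images by averaging.

    Averaging N images with independent noise cuts the noise
    by roughly a factor of sqrt(N). Real array cameras must
    also align (register) the sub-images first; this toy
    sketch assumes they are already aligned.
    """
    stack = np.stack(subimages, axis=0)
    return stack.mean(axis=0)

# Toy demo: a flat grey scene seen through 16 noisy sub-cameras
# (e.g. a hypothetical 4x4 array).
rng = np.random.default_rng(0)
scene = np.full((64, 64), 0.5)
subimages = [scene + rng.normal(0.0, 0.1, scene.shape) for _ in range(16)]

fused = combine_subimages(subimages)
noise_single = np.std(subimages[0] - scene)  # ~0.1
noise_fused = np.std(fused - scene)          # roughly 4x smaller
```

With 16 sub-images the residual noise should drop by about a factor of four, which hints at why an array of tiny sensors could rival a single larger one in low light.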

Image Array Sensor Claims Future Of Thinner, Better Mobile Device Cameras

Pelican Imaging claims that a prototype array camera exists, and a press release from the company tells us:

“Pelican’s camera improves upon image and video quality while allowing for thinner smartphones. New applications are also enabled by introducing features such as 3-D depth, gesture control, and the ability for users to interact with the image before and after capturing the shot.”
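The “3-D depth” feature mentioned in the release would presumably come from parallax: each sub-camera sees the scene from a slightly different position, and the shift (disparity) of an object between sub-images reveals its distance. As a hedged illustration, here is the classic stereo relation Z = f · B / d, with entirely hypothetical numbers (the focal length and baseline below are illustrative, not Pelican’s specs).

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Estimate depth from the standard stereo relation Z = f * B / d.

    focal_px:     focal length in pixels
    baseline_m:   distance between two sub-cameras in meters
    disparity_px: pixel shift of a feature between the two views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Hypothetical example: 1000 px focal length, 5 mm between
# sub-cameras, a feature shifted by 10 px between views.
depth_m = depth_from_disparity(1000, 0.005, 10)  # -> 0.5 m
```

The very short baselines inside a thin phone-sized array limit depth accuracy at distance, which is one practical constraint any such camera would face.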

Some find Pelican’s vision to be slightly blurred and unrealistic, though. Online marketing consultant Brandon Wirtz chimes in: “For the same kind of reason that two cameras shooting 3D give you headaches, this isn’t likely to happen. Why do we like images that don’t have infinite depth of field, or movies shot in auto focus? Why do movies look better when the camera moves closer to the subject rather than zooming?

“A fly or a spider may have ‘better’ vision than a human because it can see lots of things and put images together using stitching, but people don’t take pictures for ‘better,’ they take them to remember the event in the way it was perceived. For this reason a camera that mimics the human eye will always be what people prefer.”

Whatever future textbooks will tell us about the history of photography in the early 21st century (and the success or failure of image array sensors in mobile devices), there’s no denying that this is an exciting time for consumer technology. For now, we’ll just have to wait and see — but how we’ll see is anyone’s guess.