Saturday, August 17, 2024

Mimicking the human eye, the new AMI-EV camera captures tens of thousands of frames per second while consuming little power. It was developed by a U.S. research team.

https://wired.jp/article/camera-inspired-by-human-eyes/

Inspired by the way the human eye focuses on a moving object, researchers in the United States have developed a camera that can capture images hundreds of times more accurately than conventional cameras. The technology has the potential to revolutionize robotics and many other fields that rely on visual information from cameras.


Photograph: Tetra Images/Getty Images

The “event camera” technology was inspired by the retina of living organisms. The camera detects changes in brightness and outputs each change as an event consisting of coordinates, polarity, and a timestamp. While these characteristics make it excellent for tracking moving objects, they also make it difficult to produce clear images as the amount of motion increases. This drawback can be a life-or-death problem for robots and autonomous vehicles that respond to changing conditions based on visual information from a camera.
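
As a rough illustration only (not code from the study), the sketch below shows one way such an event stream might be represented, with coordinates, polarity, and a timestamp per event; the field names and the helper function are hypothetical.

```python
# Minimal sketch of an event-camera output stream as described above.
# Field names and the helper are hypothetical, for illustration only.
from dataclasses import dataclass
from typing import List

@dataclass
class Event:
    x: int             # pixel column where brightness changed
    y: int             # pixel row where brightness changed
    polarity: int      # +1 = got brighter, -1 = got darker
    timestamp_us: int  # time of the change, in microseconds

def events_in_window(events: List[Event], now_us: int, window_us: int) -> List[Event]:
    """Return the events that occurred within the last `window_us` microseconds."""
    return [e for e in events if now_us - e.timestamp_us <= window_us]

# Example: three brightness-change events within a 1 ms window
stream = [Event(10, 20, +1, 100), Event(11, 20, -1, 450), Event(10, 21, +1, 900)]
print(len(events_in_window(stream, now_us=1_000, window_us=1_000)))  # -> 3
```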

To overcome this challenge, a research team at the University of Maryland devised a new camera mechanism that mimics the mechanics of the human eye. “We wondered how humans and animals follow moving objects with their eyes, and thought there might be a clue there,” explains Botao He, a doctoral student in computer science.

He and his team developed the Artificial Microsaccade Enhanced Event Camera (AMI-EV), which applies the human eye's high-speed, jump-like movements for fixing the gaze so that it can capture objects clearly and without blurring. This new mechanism has the potential to revolutionize many technologies that rely on visual information from cameras.
Artificial Reproduction of Involuntary Eye Movements

Microsaccades are a type of involuntary eye movement that occurs when a person fixes their gaze on an object. Among these fixational micro-movements, microsaccades (also called flicks) are high-speed, jump-like motions. By repeating these tiny movements, the human eye maintains its grasp of visual information such as color, depth, and shading.


In this study, the researchers succeeded in artificially reproducing microsaccades by incorporating a rotating prism mechanism inside the camera. By redirecting the light rays entering through the lens with a continuously rotating prism, the camera reproduces natural human eye movements and can capture a stable image of the subject.
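
The article does not detail the compensation software, but the underlying idea, undoing a known and continuously rotating optical offset in software, can be sketched as follows. The rotation speed, the image center, and the simplification that compensation is a plain 2D rotation of event coordinates are all assumptions for illustration, not the team's actual algorithm.

```python
import math

# Hypothetical sketch: if the prism's rotation is continuous and known, software
# can map each event back to where it would land with a stationary optical path.
# The numbers and the pure-rotation model below are illustrative assumptions.
ROTATION_HZ = 50.0              # assumed prism rotation speed
CENTER_X, CENTER_Y = 160, 120   # assumed optical center of a 320x240 sensor

def prism_angle(t_s: float) -> float:
    """Prism angle in radians at time t_s, assuming constant rotation."""
    return 2.0 * math.pi * ROTATION_HZ * t_s

def compensate(x: float, y: float, t_s: float) -> tuple:
    """Rotate an event's coordinates back by the prism angle at its timestamp."""
    a = -prism_angle(t_s)
    dx, dy = x - CENTER_X, y - CENTER_Y
    return (CENTER_X + dx * math.cos(a) - dy * math.sin(a),
            CENTER_Y + dx * math.sin(a) + dy * math.cos(a))

# Example: an event seen at (200, 120), 1 ms into the rotation
print(compensate(200.0, 120.0, 0.001))
```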

According to the research team, the AMI-EV fully demonstrated its accuracy in capturing moving objects in a variety of situations during initial testing, from detecting a human pulse to recognizing the shape of a fast-moving object. The team claims that while commonly available cameras capture images at frame rates of roughly 30 fps to 1,000 fps, the AMI-EV captures smooth images at tens of thousands of frames per second.
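
To put those numbers in perspective, here is a small back-of-the-envelope calculation (illustrative values only; 10,000 fps stands in for "tens of thousands") showing how far a fast-moving object travels between consecutive frames at each rate.

```python
# Back-of-the-envelope comparison of frame intervals (illustrative values only).
# How far does an object moving at 30 m/s (~108 km/h) travel between frames?
SPEED_M_PER_S = 30.0

for fps in (30, 1_000, 10_000):   # 10,000 fps stands in for "tens of thousands"
    interval_ms = 1_000.0 / fps
    travel_mm = SPEED_M_PER_S * interval_ms   # m/s * ms is numerically mm
    print(f"{fps:>6} fps: {interval_ms:7.3f} ms between frames, "
          f"object moves {travel_mm:8.2f} mm")
```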

This invention has the potential to dramatically advance visual technology in robotics. “Just as humans perceive the world through visual information, robots perceive the world through computer processing of information from cameras. The performance of the camera is directly related to the accuracy of the robot's recognition and response,” said Yiannis Aloimonos, a professor at the University of Maryland.

Technologies that enable more precise imaging and shape detection may also revolutionize fields beyond robotics. Camera performance plays an important role in a wide range of areas, such as the immersiveness of augmented reality (AR) experiences, the reliability of surveillance-camera security, and astrophotography through telescopes.

One area where the research team believes the advantages of the AMI-EV can be maximized is the development of smart wearable devices, which must process information about the user's movements quickly and accurately. According to researcher Cornelia Fermüller, the AMI-EV captures images in extreme lighting conditions better than conventional cameras, with low latency and low power consumption. This performance makes it ideal for virtual reality applications that demand a seamless experience.

In many fields, improving camera performance is a never-ending challenge. Aloimonos believes that the AMI-EV will not only be the key to solving existing problems, but will also pave the way for the development of more advanced systems.

(Edited by Daisuke Takimoto)

For more information on the camera, please see the WIRED article.


Translated with DeepL.com (free version)
