Introspective, detail-oriented and disaster-chasing AIs – TechCrunch

Research papers come out far too frequently for anyone to read them all. This is especially true in machine learning, a field that now touches practically every industry and company (and produces a large share of those papers). This column aims to collect some of the most relevant recent discoveries and publications – particularly in, but not limited to, artificial intelligence – and explain why they matter.

It takes an emotionally mature AI to admit its own mistakes, and that is more or less what this Technical University of Munich project aims to achieve. Well, perhaps not the emotion, but the recognizing and learning from mistakes, specifically in self-driving cars. The researchers propose a system in which the car looks back at all the times in the past when it had to hand control over to a human driver and thereby learns its own limits – what they call “introspective failure prediction.”

For example, if there are many cars ahead, the autonomous vehicle’s brain could use its sensors and logic to make a de novo decision about whether an approach would work. But the TUM team says that by simply comparing new situations with old ones, it can decide much faster whether it will need to hand over control. Saving six or seven seconds here could make all the difference for a safe handover.
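As a rough illustration of that comparison idea – not the TUM team’s actual method – here is a minimal nearest-neighbor sketch in Python. The feature vectors, the distance threshold and the idea of storing one vector per past disengagement are all invented for the example.

```python
# Hedged sketch of "compare new situations with old ones": flag an early
# handover when the current scene looks like a remembered failure case.
# Feature encoding and threshold are hypothetical, not from the TUM paper.
import numpy as np

class IntrospectiveFailurePredictor:
    def __init__(self, past_disengagement_features: np.ndarray, threshold: float = 0.4):
        # Each row is a feature vector from a scene where the car previously
        # had to hand control back to a human driver.
        self.memory = past_disengagement_features
        self.threshold = threshold

    def likely_to_fail(self, scene_features: np.ndarray) -> bool:
        # Distance to the most similar past disengagement; if the new scene is
        # close enough to a known failure, predict that a handover is coming.
        dists = np.linalg.norm(self.memory - scene_features, axis=1)
        return bool(dists.min() < self.threshold)

# Usage with made-up numbers: 100 remembered failure scenes, 8 features each.
memory = np.random.rand(100, 8)
predictor = IntrospectiveFailurePredictor(memory)
print(predictor.likely_to_fail(np.random.rand(8)))
```

The appeal of a lookup like this is speed: checking similarity against stored cases is far cheaper than reasoning about the scene from scratch, which is where those six or seven seconds would come from.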

It is important for robots and autonomous vehicles of all kinds to be able to make decisions without phoning home, especially in combat, where decisive and precise movements are required. The Army Research Lab is looking into ways that ground and air vehicles can interact autonomously, allowing, for example, a mobile landing pad that drones can land on without needing to coordinate, ask permission or rely on precise GPS signals.

Their solution, at least for testing purposes, is actually rather low-tech. The ground vehicle has a landing pad on top with a huge QR code on it, which the drone can see from a considerable distance. The drone can track the exact position of the pad completely on its own. In the future, the QR code could be dropped and the drone could instead identify the shape of the vehicle itself, presumably using some best-guess logic to determine whether it is the one it wants.
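To give a sense of how low-tech this kind of fiducial tracking can be, here is a hedged sketch using OpenCV’s built-in QR detector. The camera source and the idea of feeding the pad’s pixel position to a landing controller are assumptions for the example, not details from the ARL work.

```python
# Minimal sketch: locate a QR-coded landing pad in a camera frame with OpenCV.
# Illustration only, not the Army Research Lab's pipeline.
import cv2

detector = cv2.QRCodeDetector()
cap = cv2.VideoCapture(0)  # stand-in for the drone's downward-facing camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # detectAndDecode returns the decoded text plus the four corner points of
    # the code in image coordinates; the corners give the pad's position and
    # apparent size without GPS or any radio link to the ground vehicle.
    data, points, _ = detector.detectAndDecode(frame)
    if points is not None:
        center = points.reshape(-1, 2).mean(axis=0)
        print(f"Pad '{data}' at pixel {center}")  # hand off to the landing controller

cap.release()
```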

Illustration showing how an AI tracks cells through a microscope.

Credit: Nagoya City University

In the medical world, AI is being put to work on tasks that are not so much difficult as tedious for people. A good example is tracking the activity of individual cells in microscopy images. It is not a superhuman task to look at a few hundred images spanning several depths of a petri dish and track the movements of cells, but that doesn’t mean students like doing it.

This software from researchers at Nagoya City University in Japan does it automatically, using image analysis and the ability (much improved in recent years) to understand objects over a period of time rather than just in individual frames. Read the paper here, and see the extremely cute illustration showing off the tech above. More research organizations should hire professional artists.
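For a flavor of what the automated version involves, here is a toy sketch that links cell centroids between two consecutive frames with a minimum-cost assignment, so each cell keeps its identity over time. It is not the Nagoya group’s method, and the distance gate and example coordinates are arbitrary placeholders.

```python
# Toy frame-to-frame cell linking: match centroids detected in frame t to
# centroids in frame t+1 by solving an assignment problem (Hungarian method).
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_frames(prev_centroids: np.ndarray, curr_centroids: np.ndarray,
                max_dist: float = 20.0):
    """Return (prev_index, curr_index) pairs for cells matched across frames."""
    # Pairwise distances between every old and new centroid.
    cost = np.linalg.norm(prev_centroids[:, None, :] - curr_centroids[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)  # minimum-cost matching
    # Drop matches that moved implausibly far (likely a new or lost cell).
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]

# Usage with made-up coordinates, e.g. from a per-frame segmentation step.
frame_a = np.array([[10.0, 12.0], [40.0, 41.0], [80.0, 15.0]])
frame_b = np.array([[11.5, 13.0], [42.0, 40.0], [79.0, 18.0]])
print(link_frames(frame_a, frame_b))  # -> [(0, 0), (1, 1), (2, 2)]
```

A real system would also handle cells dividing, disappearing between depth slices and drifting out of frame, which is exactly the sort of bookkeeping nobody wants to do by hand across hundreds of images.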

The process is similar to tracking birthmarks and other skin features in people at risk of melanoma. They may see a dermatologist once a year to check whether a particular spot seems sketchy, but the rest of the time they have to keep track of their own moles and freckles some other way – which is hard when they’re in places like your back.
