Translating the language of behavior through AI
05/10/2024
20 mins
Prasobh V Nair
BUSINESS, AI, BLOCKCHAIN
Did you know that Hollywood stars often wear motion-capture suits? These full-body costumes are fitted with sensors, and with the data those sensors record, computer software can transform an actor into the Hulk or some other monster.
Now, Princeton professors Joshua Shaevitz and Mala Murthy have brought their labs together to take this technology a step further. Using the latest advances in artificial intelligence, their teams can track the individual body parts of animals in existing video.
To this end, they have developed a tool called LEAP, which tracks the individual body parts of animals across millions of video frames with a high level of accuracy. The system can be trained in just a few minutes, and once trained, it tracks body parts without the need for any physical markers or labels.
According to Murthy, the method holds great promise and has broad applications across animal model systems. It can also be used to analyze the behavior of animals that have undergone drug treatments or that carry genetic mutations.
The paper, published in January 2019 in the journal Nature Methods, details the new technology in depth. An open-access version of the software was released ahead of publication and has already been adopted by a number of laboratories.
Combining LEAP with other tools developed in their laboratories allows the researchers to study what they call the language of behavior: the patterns in the body movements of animals.
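As a rough illustration of what such an analysis might look like, the sketch below turns tracked body-part positions into a simple behavioral readout: per-part speed and the fraction of time each part spends moving. The array layout, the stand-in data, and the movement threshold are assumptions for this demo, not LEAP's actual output format.

```python
import numpy as np

# Stand-in for tracked positions: (frame, body part, xy) in pixels.
# A random walk serves as fake trajectory data for the demo.
tracks = np.cumsum(np.random.randn(1000, 3, 2), axis=0)

velocity = np.diff(tracks, axis=0)         # per-frame displacement of each part
speed = np.linalg.norm(velocity, axis=-1)  # speed in pixels/frame, shape (999, 3)

# One simple behavioral readout: how often each body part is moving.
MOVING_THRESHOLD = 0.5                     # pixels/frame; arbitrary for this demo
print("fraction of time moving, per part:", (speed > MOVING_THRESHOLD).mean(axis=0))
```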
According to Talmo Pereira, a graduate student at the Princeton Neuroscience Institute (PNI) and the first author on the paper, LEAP is an extremely flexible tool that can be applied to a wide range of video data. The workflow involves labeling a few points in a handful of video frames; the neural network handles the rest. LEAP also provides an interface simple enough for researchers with no prior programming experience.
In a demonstration, Pereira showed a motion-tagged video of a walking giraffe to prove the tool's usefulness on large mammals. LEAP's capabilities had already been tested on smaller animals such as mice and flies, but the software worked equally well on the giraffe.
The video of the walking giraffe was taken at the Mpala research station, and the team labeled specific points in 30 video frames, which took less than an hour. From there, LEAP tracked the motion in the rest of the video on its own.
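To make this "label a few frames, let the network do the rest" workflow concrete, here is a minimal, hypothetical sketch in PyTorch. It is not LEAP's code or API: LEAP itself reportedly uses a fully convolutional network that outputs confidence maps for each body part, whereas this toy regresses coordinates directly, and all data here are random stand-ins.

```python
import torch
import torch.nn as nn

N_PARTS = 3          # e.g. head, thorax, tail tip (assumed for the demo)
FRAME_SIZE = 64      # square grayscale frames, 64x64 pixels

class KeypointNet(nn.Module):
    """Tiny CNN that maps a frame to (x, y) coordinates for each body part."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * (FRAME_SIZE // 4) ** 2, N_PARTS * 2)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.head(z).view(-1, N_PARTS, 2)   # (batch, part, xy)

# Stand-ins for ~30 hand-labeled frames and their keypoint coordinates.
labeled_frames = torch.rand(30, 1, FRAME_SIZE, FRAME_SIZE)
labeled_points = torch.rand(30, N_PARTS, 2) * FRAME_SIZE

model = KeypointNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):                                # brief training, minutes at most
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(labeled_frames), labeled_points)
    loss.backward()
    opt.step()

# Once trained, predict body-part positions for every remaining frame.
unlabeled_frames = torch.rand(1000, 1, FRAME_SIZE, FRAME_SIZE)
with torch.no_grad():
    tracks = model(unlabeled_frames)                # (1000, N_PARTS, 2) trajectories
```

The key design point the article describes survives even in this toy version: a few dozen labeled frames are enough to supervise the network, which then produces trajectories for every frame in the video.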
Previously, considerable effort went into developing artificial intelligence tools that track human motion, but those tools rely on large, manually annotated datasets. LEAP adapts similar methods to work with data collected at laboratory scale.