The Tufts Center for Regenerative and Developmental Biology, in partnership with Wireless Techniques, has developed a new method for examining test subject behavior. Because manual observation by human researchers can be inaccurate, expensive, and time-consuming, the new automated learning and testing chamber instead analyzes the behavior of small animals on a 24/7 basis, with several experiments running at the same time. The result will be greater insight into learning and memory.
Using human researchers to observe test subject behavior during research experiments can be both time-consuming and expensive. The available manpower for real-time observation is limited, and human observations are inherently subjective. To address some of these limitations, the Tufts Center for Regenerative and Developmental Biology has partnered with Wireless Techniques to develop the first automated learning and testing chamber for analyzing behavior in small animals. The chamber uses a Cognex In-Sight Micro vision system instead of a human researcher to observe the behavior of test subjects.
Tufts researchers are using the new testing chamber to study the molecular mechanisms underlying the ability of living things to learn from their environment. Light stimuli are used to train worms and tadpoles on specific tasks, and the animals are then tested for recall in a variety of molecular-genetic and pharmacological experiments. The new tracking system provides quantitative data on the subjects’ behavior and performance in learning tests. It is the first system not only to track animal movement, but also to provide parallel, independent feedback to each subject so that the animals can learn specific tasks. Simple animals such as flatworms share many of the same behavioral pathways and neurotransmitters with human beings, so these animals are often studied to better understand the properties of memory storage and transmission in tissue. The new chamber makes it possible to test new drug compounds to determine whether they affect cognitive ability.
According to Professor Michael Levin, Director of The Tufts Center for Regenerative and Developmental Biology, “Modern cognitive science is striving to understand the connection between molecular genetics and the information processing mechanisms that give rise to behavior and thought. The biomedical aspect of this goal includes the search for drugs that will aid learning and memory and the understanding of various influences on cognition.”
In a typical experiment, worms will be trained to stay in or avoid specific parts of the dish, or to move at specific rates. Worms that successfully perform the task will be rewarded by lowered light levels, as worms naturally prefer the dark.
Until now, such studies have been performed manually, but the manual approach to assessing behavior puts significant limits on experimental progress. Only a limited number of animals can be analyzed by hand because of manpower and cost constraints. Manual handling also allows the results to be affected by the judgment and errors of the person running the experiment; for example, the lack of consensus on the learning abilities of flatworms has been attributed to the small sample sizes required by manual training. Manual methods also make it difficult or impossible for other labs to replicate results, or for other scientists to review the original experiment and potentially uncover trends the experimenter missed.
The Tufts Center selected Wireless Techniques to design and build an automated learning and testing chamber that could provide real-time feedback without a human researcher. Wireless Techniques, much of whose assets were later acquired by its successor-in-interest Boston Engineering Corporation, a product and systems development services firm headquartered in Waltham, MA, designs and builds custom electronic devices and instrumentation for applications including wireless and wired communications, sensing, and signal processing. Cognex was chosen as the vision system supplier because its sophisticated image processing tools could determine the position of the worms despite the complicated shadowing effects created by the movement of water in the test chamber.
How the Chamber Works
The chamber consists of 12 cells arranged in a grid, each holding a disposable Petri dish where a worm lives. The environment in each cell is individually controlled by the software based on the behavior of the animal within. The lid contains a series of computer-controlled light-emitting diodes (LEDs) used to train the worms. A set of four bright LEDs can be set to illuminate a single quadrant of the dish, and barriers prevent the light from spreading to adjacent quadrants. Red LEDs, which the worms cannot see, remain on throughout experiments so that the vision system can track the worms' motion without affecting their behavior. Electrodes in the dish also allow the experimenter to deliver weak electrical signals to the animals.
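The per-cell control surface described above can be pictured as a small data structure. This is purely an illustrative sketch; the class and field names are invented for this article and are not part of the lab's actual control software.

```python
from dataclasses import dataclass, field

@dataclass
class CellState:
    """Illustrative model of one independently controlled cell.
    Field names are hypothetical, chosen to mirror the article's description."""
    quadrant_leds: list = field(default_factory=lambda: [False] * 4)  # bright training LEDs, one per quadrant
    red_led_on: bool = True           # always on: invisible to worms, visible to the camera
    electrode_voltage: float = 0.0    # weak electrical stimulus delivered via dish electrodes

# Twelve cells arranged in a grid, each controlled independently.
chamber = [CellState() for _ in range(12)]
chamber[0].quadrant_leds[2] = True    # e.g., light quadrant 2 of the first cell only
```

Because each `CellState` is independent, stimulating one dish leaves the other eleven untouched, which is what allows 12 experiments to run in parallel.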
Top view of the group of four “Experiment Environment Modules” with two of the Illumination Heads open to show access to the Shock Electrode Holder and Petri Dish.
Each experiment is controlled by an algorithm written by Levin’s team. First, the position of the worm in its dish is recorded by the vision camera, followed by an action, such as turning on a light in one quadrant of the dish with the goal of teaching the animal to swim to the lighted quadrant. Next, the position of the worm is recorded again. Based on the worm's position (and second-order quantities such as speed and direction of movement), another action may be taken, such as rewarding the worm by turning down the lights because it swam to the correct quadrant, or turning on a bright light because it did not perform the task properly. This series of measurements and actions continues until the program reaches a predefined condition, such as a level of performance indicating that the animal has learned the task.
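The observe-act-evaluate loop just described can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the lab's actual algorithm: the callbacks `get_worm_position`, `set_quadrant_light`, and `set_dish_brightness` are hypothetical stand-ins for the camera and LED interfaces, and the success criterion is invented.

```python
def run_training_trial(get_worm_position, set_quadrant_light, set_dish_brightness,
                       target_quadrant, criterion=5, max_cycles=1000):
    """Observe-act-evaluate loop: cue a quadrant, check the worm's position,
    reward (dim light) or punish (bright light), and stop once the worm has
    reached the target `criterion` times in a row (a hypothetical stop rule).
    Returns the number of cycles used, or None if the worm never learns."""
    successes = 0
    for cycle in range(max_cycles):
        set_quadrant_light(target_quadrant, True)   # action: light the target quadrant
        quadrant = get_worm_position()              # observe: position from the vision camera
        if quadrant == target_quadrant:
            set_dish_brightness(0.1)                # reward: worms naturally prefer the dark
            successes += 1
        else:
            set_dish_brightness(1.0)                # punish: bright light
            successes = 0
        if successes >= criterion:                  # predefined condition reached
            return cycle + 1
    return None

# Simulated hardware: the worm wanders for two cycles, then stays in quadrant 0.
positions = iter([1, 2, 0, 0, 0, 0, 0])
cycles = run_training_trial(lambda: next(positions),
                            lambda q, on: None,     # LED stub
                            lambda level: None,     # brightness stub
                            target_quadrant=0)
```

In the simulation the worm fails twice and then succeeds five times in a row, so the trial ends after seven cycles.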
Since the system is automated, 12 experiments can be run simultaneously, 24 hours a day, seven days a week, without human intervention. As a result, much larger sample sizes can be achieved, and experiments can be run for much longer periods. Millions of observation and training cycles can be performed, creating a level of training far beyond what can realistically be accomplished by manual methods. The system also provides complete consistency among experiments, allowing labs to replicate experiments performed elsewhere and reducing the amount of noise in the data. Additionally, the vision system records the worms’ motion, so it can easily be reviewed and analyzed by other experts over the Internet.
Overcoming the Vision Challenge
“Machine vision was one of the greatest challenges in this automated learning system,” said Chris Granata, former President of Wireless Techniques, now Program Manager, Wireless and Sensing Technologies at Boston Engineering. “The water touching the sides of the dish creates a meniscus that rises and falls. This creates shadows that change over time and are difficult to distinguish from the worms. This application requires a vision system with powerful vision tools that are capable of identifying the location of the worm and is completely self-contained in a compact package so we can easily increase the number of cells. The Cognex In-Sight Micro 1400 was ideal for this application because of its broad toolset and the fact that the entire system is contained in a 30 mm x 30 mm x 60 mm enclosure.”
Three experiment cells as viewed by the Cognex In-Sight Micro 1400.
To reliably differentiate worms from randomly changing water shadows, images of empty quadrants are captured every 20 seconds. This is accomplished by tracking the worm’s position and capturing each quadrant while it is unoccupied. When the system captures an image of the worm in a quadrant, it subtracts the most recent image of that quadrant taken while the worm was absent, removing the shadows and more accurately determining the worm's position.
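The background-subtraction step can be illustrated with a small numpy sketch. This is not the Cognex implementation; pixel values and the tiny 5x5 "quadrant" are invented purely to show how a shadow present in both frames cancels out while the worm, absent from the background frame, survives.

```python
import numpy as np

def subtract_background(current, background):
    """Subtract the most recent empty-quadrant image from the current one.
    Meniscus shadows present in both frames cancel; the worm remains.
    Signed arithmetic avoids uint8 wrap-around, then clips back to 0..255."""
    diff = current.astype(np.int16) - background.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)

# Toy 5x5 quadrant: a shadow band appears in both frames; the worm only in the current one.
background = np.zeros((5, 5), np.uint8)
background[1, :] = 80                 # meniscus shadow captured while quadrant was empty
current = background.copy()
current[3, 2] = 200                   # bright worm pixel under the red-LED illumination
clean = subtract_background(current, background)
```

After subtraction the shadow row is zeroed out and only the worm pixel remains, which is what lets the later thresholding step isolate the animal.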
A histogram tool is used to identify and group the lightest-colored pixels, which indicate possible positions of the worm. Several convolution and morphological filters enhance the image; for example, morphological dilation filtering connects white pixels in close proximity and smooths the edges of white islands. Next, a blob detection tool picks out the three largest groups of light-colored pixels and sorts them by size. In almost every case, the largest object is the worm; however, multiple objects are tracked to handle the rare possibility that one or more shadows may be larger than the worm.
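The threshold, dilate, and blob-sort pipeline can be sketched with plain numpy as a stand-in for the proprietary Cognex tools. Everything here is illustrative: the 10x10 image, the threshold value, and the pure-Python labeling are simplifications of what the real histogram, dilation, and blob tools do.

```python
import numpy as np

def dilate(mask):
    """3x3 morphological dilation: each pixel becomes the OR of its neighborhood,
    connecting nearby white pixels into one island (stand-in for the real filter)."""
    p = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out |= p[1 + dr:1 + dr + mask.shape[0], 1 + dc:1 + dc + mask.shape[1]]
    return out

def largest_blobs(mask, n=3):
    """Label 8-connected white regions by flood fill and return up to n
    (size, label) pairs, largest first. The largest is almost always the worm."""
    labels = np.zeros(mask.shape, int)
    sizes, next_label = [], 0
    for r in range(mask.shape[0]):
        for c in range(mask.shape[1]):
            if mask[r, c] and labels[r, c] == 0:
                next_label += 1
                labels[r, c] = next_label
                stack, size = [(r, c)], 0
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                    and mask[ny, nx] and labels[ny, nx] == 0):
                                labels[ny, nx] = next_label
                                stack.append((ny, nx))
                sizes.append((size, next_label))
    sizes.sort(reverse=True)
    return sizes[:n]

img = np.zeros((10, 10), np.uint8)
img[2, 2] = img[2, 4] = 255          # two bright fragments of the worm, close together
img[7, 7] = 255                      # an isolated bright pixel (residual shadow)
mask = img > 200                     # histogram-style threshold: keep only the lightest pixels
blobs = largest_blobs(dilate(mask))  # dilation merges the two worm fragments into one blob
```

Dilation merges the two nearby worm fragments into a single large blob, while the isolated shadow pixel grows into a smaller one; sorting by size then puts the worm first, exactly the ordering the article describes.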
Scientists hope to use this research to make discoveries about the molecular basis of memory and to develop new nootropic drugs.
“We are using quantitative automated behavior analysis techniques to ask how and where information is encoded and how it can be imprinted upon the regenerating brain by other tissues,” said Levin. “Genetic changes can be made to the worms and then their learning performance can be measured in the chamber in order to understand which genes affect learning and memory. This chamber also provides a very powerful tool for investigating the mechanisms of memory and behavior and for drug screening of new nootropic compounds designed to treat conditions such as attention-deficit hyperactivity disorder (ADHD), drug addiction, etc. as well as counteract effects of neurotoxins and improve cognitive performance. Automating the training and testing process will enable us to make faster progress by running many more experiments on a 24/7 basis instead of just when human experimenters are available; moreover, the quantitative data (impossible to obtain with human observers) will reveal unprecedented insights into the processes of learning and memory.”
Mark W. Smithers is VP/COO at Boston Engineering Corporation. He is responsible for overseeing general engineering operations support such as facilities, communications, productivity tools, development standards and CAD/CAE systems. Smithers can be reached at 781-314-0714 or firstname.lastname@example.org .