Finally, someone has designed a way to convert one of the world’s biggest pests into something useful.
Using an electronic interface, a group of researchers from North Carolina State University has developed a method to steer and remotely control cockroaches. Rejoice.
“Our aim was to determine whether we could create a wireless biological interface with cockroaches, which are robust and able to infiltrate small spaces,” Alper Bozkurt said, according to Physorg.com.
Bozkurt, an assistant professor of electrical engineering at NC State, was co-author of the project’s paper, presented recently at the International Conference of the IEEE Engineering in Medicine and Biology Society in San Diego, Calif.
“Ultimately, we think this will allow us to create a mobile web of smart sensors that uses cockroaches to collect and transmit information, such as finding survivors in a building that’s been destroyed by an earthquake,” he said.
“Building small-scale robots that can perform in such uncertain, dynamic conditions is enormously difficult,” Bozkurt added. “We decided to use biobotic cockroaches in place of robots, as designing robots at that scale is very challenging and cockroaches are experts at performing in such a hostile environment.”
A robot named Nico could soon pass a landmark test – recognising itself in a mirror.
Such self-awareness would represent a step towards the ultimate goal of thinking robots.
Nico, developed by computer scientists at Yale University, will take the test in the coming months.
The ultimate aim is for Nico to use a mirror to interpret objects around it, in the same way as humans use a rear-view mirror to look for cars.
“It is a spatial reasoning task for the robot to understand that its arm is on it, not on the other side of the mirror,” Justin Hart, the PhD student leading the research, told BBC News.
So far the robot has been programmed to recognise a reflection of its arm, but ultimately Mr Hart wants it to pass the “full mirror test”.
University of Florida researchers have moved a step closer to treating diseases on a cellular level by creating a tiny particle that can be programmed to shut down the genetic production line that cranks out disease-related proteins. In laboratory tests, these newly created “nanorobots” all but eradicated hepatitis C virus infection. The programmable nature of the particle makes it potentially useful against diseases such as cancer and other viral infections.

The research effort, led by Y. Charles Cao, a UF associate professor of chemistry, and Dr. Chen Liu, a professor of pathology and endowed chair in gastrointestinal and liver research in the UF College of Medicine, is described online this week in the Proceedings of the National Academy of Sciences.

“This is a novel technology that may have broad application because it can target essentially any gene we want,” Liu said. “This opens the door to new fields so we can test many other things. We’re excited about it.”
During the past five decades, nanoparticles — particles so small that tens of thousands of them can fit on the head of a pin — have emerged as a viable foundation for new ways to diagnose, monitor and treat disease. Nanoparticle-based technologies are already in use in medical settings, such as in genetic testing and for pinpointing genetic markers of disease. And several related therapies are at varying stages of clinical trial. The Holy Grail of nanotherapy is an agent so exquisitely selective that it enters only diseased cells, targets only the specified disease process within those cells and leaves healthy cells unharmed.
For all the speech lines we hear about jobs these days, rarely does anyone mention robots.
They do occasionally, but usually it’s saved for the “innovation” speeches. This is understandable: if you’re running for office, it’s better to keep the two ideas separate. Jobs are good because they’re, well, jobs, and robots are good because they mean progress. But mix the two together and soon enough people will start asking how you’ll be able to create a lot of jobs if these really smart machines are doing more and more of the work.
No, I’m not going all Luddite on you. I’m in awe of machines and the remarkable things they can now do. But that’s the point. We’re not talking about the technology of the past, which clearly made humans more productive and allowed us to move into better-paying jobs requiring more specialized skills.
Now we’re creating machines that are much more than tools. They’re learning to think and adapt, and technologists such as Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, believe that within five to ten years, machines will be able to surpass the ability of humans to do routine work. As he told The Fiscal Times: “It’s the first time we’ve had this level of technology that allows machines to solve problems on their own, to interact with their environment, to analyze visual images, and to manipulate their environment based on that.”
What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel — or at least the ability to identify different materials by touch.
Researchers at the University of Southern California’s Viterbi School of Engineering published a study June 18 in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.
The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. The sensor is capable of other human-like sensations as well: it can tell where and in which direction forces are applied to the fingertip, and can even sense the thermal properties of an object being touched.
Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.
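The general idea behind this kind of texture identification can be sketched in a few lines: the vibration signal from a swipe over a surface is summarized by its frequency content, then matched against stored reference spectra. The sketch below is purely illustrative, not the USC team’s published method; the sampling rate, texture names, and signal model are all invented for the demo.

```python
# Illustrative texture-by-vibration sketch (synthetic data, assumed parameters).
import numpy as np

FS = 2000  # assumed sampling rate in Hz

def spectral_features(signal, n_bands=8):
    """Summarize a vibration signal as normalized power in n_bands frequency bands."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    bands = np.array_split(power, n_bands)
    feats = np.array([b.mean() for b in bands])
    return feats / feats.sum()  # normalize so sliding speed/pressure matters less

def make_signal(dominant_hz, rng):
    """Toy model: sliding over a texture excites one characteristic frequency."""
    t = np.arange(FS) / FS  # one second of data
    return np.sin(2 * np.pi * dominant_hz * t) + 0.3 * rng.standard_normal(FS)

rng = np.random.default_rng(0)
textures = {"glass": 40.0, "denim": 180.0, "sandpaper": 400.0}  # invented values

# "Training": store one reference feature vector per texture.
reference = {name: spectral_features(make_signal(hz, rng))
             for name, hz in textures.items()}

def identify(signal):
    """Nearest-centroid match of a new swipe against the stored reference spectra."""
    feats = spectral_features(signal)
    return min(reference, key=lambda n: np.linalg.norm(feats - reference[n]))

probe = make_signal(400.0, rng)  # an unknown swipe, actually over "sandpaper"
print(identify(probe))
```

The real BioTac work is far richer (it also chooses *how* to explore, e.g. which pressure and speed to use), but the same match-a-vibration-signature principle is at its core.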
A number of life-support machines are connected to each other, circulating liquids and air in an attempt to mimic a biological structure. The Immortal investigates human dependence on electronics, the desire to make machines replicate organisms and our perception of anatomy as reflected by biomedical engineering.
Jianhui manipulates objects with his hands and gets a drink as a reward. Unknown to him, not far away a robot hand mirrors his fingers’ moves as it receives instructions from the chips implanted in his brain.
Zheng Xiaoxiang of the Brain-Computer Interface Research Team at Zhejiang University in Zijingang, China, and colleagues announced earlier this week that they had succeeded in capturing and deciphering the signals from the monkey’s brain and translating them into real-time robotic finger movements.
The two sensors implanted in Jianhui’s brain monitor just 200 neurons in his motor cortex, Zheng says. However, this was enough to accurately interpret the monkey’s movements and control the robotic hand.
Humans have used electrodes to control prosthetic arms, but Zheng claims this research looks at the finer movements of the fingers.
“Hand moves are associated with at least several hundreds of thousands of neurons,” she said. “We now decipher the moves based on the signals of about 200 neurons. Of course, the orders we produced are still distant from the truly flexible finger moves in complexity and fineness.”
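A common way such decoders work, in the abstract, is to fit a linear map from the firing rates of the recorded neurons to the measured movement, then invert that map on new activity. The sketch below illustrates only that general principle with entirely synthetic data; it is not the Zhejiang team’s actual pipeline, and the neuron count is the only number taken from the article.

```python
# Toy linear neural decoder: 200 "neurons" -> 5 finger-movement components.
# Everything here is simulated; no real recordings are involved.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_fingers, n_samples = 200, 5, 1000

# Ground-truth "tuning": each neuron's rate depends linearly on finger motion.
tuning = rng.standard_normal((n_neurons, n_fingers))

# Simulated training session: finger motions and the noisy rates they evoke.
moves = rng.standard_normal((n_samples, n_fingers))
rates = moves @ tuning.T + 0.5 * rng.standard_normal((n_samples, n_neurons))

# Fit a least-squares decoder W such that rates @ W approximates moves.
W, *_ = np.linalg.lstsq(rates, moves, rcond=None)

# Decode a new burst of activity back into a commanded finger movement.
true_move = rng.standard_normal(n_fingers)
observed = true_move @ tuning.T + 0.5 * rng.standard_normal(n_neurons)
decoded = observed @ W

print(np.round(decoded, 2), np.round(true_move, 2))
```

With only a few hundred neurons the reconstruction is coarse, which echoes Zheng’s caveat that 200 neurons fall well short of the complexity of truly dexterous finger movement.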
It’s alarming enough when robots ingest plant detritus like twigs and grass clippings. It’s another thing entirely when they can start chowing down on members of the animal kingdom. A pair of prototype robots are designed to catch bugs, a major step on the path toward robots that can hunt, catch and digest their own meals.
The tiny robots are modeled after the lobes of Venus flytraps, which snap shut as soon as sensitive hairs inside detect an alighting insect. One prototype, developed at Seoul National University, is made of shape-memory materials that switch between two states when subjected to a current. The other, made at the University of Maine, uses artificial muscles made of a gold nanomaterial.