Tag Archive: robots


Image: human-robot competition (Shutterstock)

Researchers have developed a tiny robotic muscle that’s 1,000 times stronger than a human muscle.

The team of researchers at the University of California, Berkeley found that vanadium dioxide changes from an insulator to a conductive metal at about 152 degrees Fahrenheit (67 °C), releasing a huge amount of force during the transition.

The scientists used the material to demonstrate a microchip-sized, twisting robotic motor that could catapult objects 50 times heavier than itself over a distance five times its own length, faster than the blink of an eye – within 60 milliseconds.

The team fabricated the micro-muscle from a long V-shaped ribbon of chromium and vanadium dioxide, a material already prized for its ability to change size, shape and physical identity, and heated it with a tiny heating pad or by passing an electrical current through it.

Continue reading

Plants could soon have robotic counterparts. Barbara Mazzolai from the Italian Institute of Technology in Genoa and colleagues are creating a system that mimics the behaviour of roots. The team plans to use bespoke soft sensors for underground exploration, tips that grow by unwinding material and a mechanism to reduce friction when penetrating the soil. The artificial system will be equipped to detect gravity, water, temperature, touch, pH, nitrate and phosphate.

Modelling a growing root is complex because it bends while increasing in length, adding cells on the opposite side from the direction in which it is heading. At the same time, a root perceives several physical and chemical stimuli at once and prioritises them; how it makes these decisions is not completely understood. “The mock-ups and prototypes we’ve developed aim to validate some of the functions and features of plant roots,” says Mazzolai.

Continue reading

A humanoid robot designed to alleviate the isolation of astronauts by having conversations has been deployed to the International Space Station (ISS).

The world’s first talking robot to be sent into space was dispatched on his mission to the ISS by the Japanese space agency JAXA over the weekend. Kirobo, modelled on Astro Boy, is expected to arrive on the station on 9 August, where he will join future ISS commander Koichi Wakata as a friendly robotic companion.

The aim of the 34-centimetre, 1-kilogram robot is to study whether machines can lend emotional support to humans in isolated conditions. Its name, derived from the Japanese words for “hope” and “robot”, reflects this idea. “Hope of the Japanese technology,” the website says. “Hope for tomorrow’s children. It carries hope on its small shoulders. Hope for the future of humans living together with robots.”

Continue reading

In recent years robots have been touted as the solution for a wide range of issues, from assisting law enforcement in dealing with potentially explosive devices to providing aid to the elderly during home health care. However, the precision and delicate touch needed in such situations have remained slightly out of reach for even the most advanced robots, until now.

A group of roboticists at the Department of Biomedical Engineering at the Georgia Institute of Technology in Atlanta have developed a touch system that allows a robot to feel its way around various situations. Publishing their results in the International Journal of Robotics Research, the team revealed a new kind of robotic mechanism that uses a combination of touch and sight to navigate delicate maneuvers. Facilitated by what the team calls “artificial skin,” the development allows a robotic arm to feel its way through clutter and actually pick out specific objects, in much the same way as a human would.

Image of robot ants

Robots built to mimic ants suggest that real ants waste little, if any, mental energy deciding which way to go when they reach an uneven fork in the road, according to a new study. Instead, the ants just take the easiest route as dictated by geometry.

“The shape of their network relieves some of the cognitive load for the ants; they don’t need to think about it,” Simon Garnier, a biologist at the New Jersey Institute of Technology, told NBC News. “The shape of their networks has constrained their movement in a way that is more efficient for them.”
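That “easiest route” idea can be captured in a few lines of code. The sketch below is purely an illustration of the principle, not the study’s model: at a fork, the branch taken is simply the one requiring the smallest turn away from the current heading, so the geometry of the trail network does the “deciding” for the ant.

```python
def easiest_branch(heading_deg, branch_angles_deg):
    """Return the branch requiring the smallest turn from the current heading.

    Illustrative rule of thumb only (not the published model): the trail's
    geometry, rather than any deliberation, determines which way the ant goes.
    """
    def turn_needed(branch_deg):
        diff = abs(branch_deg - heading_deg) % 360
        return min(diff, 360 - diff)          # wrap to the range [0, 180]

    return min(branch_angles_deg, key=turn_needed)

# An ant heading at 0 degrees reaches an asymmetric fork: one branch bends
# 30 degrees one way, the other 120 degrees the other way.
print(easiest_branch(0, [30, -120]))   # -> 30, the shallower turn
```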

The findings have implications for understanding ant biology as well as how humans design transportation networks for the flow of people, information and goods.

Continue reading

It’s a staggering modern-day irony that the most common complication for hospital patients is acquiring an infection during their visit, something that affects 1 in 20 patients in the US. The problem is estimated to cause millions of infections each year, with 100,000 or so leading to death, and a whopping $45 billion annually in hospital costs. As if this weren’t bad enough, tragedies caused by deadly superbugs within healthcare facilities are on the rise and will likely keep rising as the last lines of antibiotics fail, with no new drugs moving fast enough through the pipeline to help.

Fortunately, an alternative to medication promises to vastly improve the disinfection of hospital rooms, thanks to a UV light-emitting robot from Xenex Healthcare.

Using a pulsed-xenon UV lamp, the portable bot shoots out 120 flashes of light per minute. Each pulse lasts a thousandth of a second, and a typical treatment runs for 10 to 20 minutes. The UV rays pass through the outer wall of a bacterium and damage its DNA, making it impossible for the pathogen to mutate or reproduce. This stops it from propagating or causing harm.
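Taken together, those figures imply surprisingly little actual lamp-on time. A quick back-of-the-envelope calculation, using only the numbers quoted above:

```python
# Rough arithmetic from the figures above: 120 flashes per minute, each
# lasting a thousandth of a second, over a 10-20 minute treatment.

FLASHES_PER_MINUTE = 120
PULSE_SECONDS = 0.001
TREATMENT_MINUTES = (10, 20)

for minutes in TREATMENT_MINUTES:
    flashes = FLASHES_PER_MINUTE * minutes
    uv_on_seconds = flashes * PULSE_SECONDS
    print(f"{minutes}-minute run: {flashes} flashes, "
          f"about {uv_on_seconds:.1f} s of cumulative lamp-on time")
```

So even a 20-minute cycle delivers its dose in only a couple of seconds of actual emission, spread across thousands of intense pulses.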

Continue reading

New HAL Exoskeleton: Brain-Controlled Full Body Suit to Be Used in Fukushima Cleanup

Japanese company Cyberdyne today announced an improved version of HAL (Hybrid Assistive Limb), the exoskeleton we wrote about almost two years ago, when a tech journalist took a few steps at CES 2011 wearing the brain-controlled cyber-trousers. The latest version of HAL is still brain-controlled but has evolved into a full-body robot suit that shields the wearer from heavy radiation without the wearer feeling its weight. Eventually it could be used by workers dismantling the crippled Fukushima nuclear plant.

The new type of HAL is on display today at the Japan Robot Week exhibition in Tokyo. It will be used by workers at nuclear disaster sites and will be field tested at Fukushima, where a tsunami in March 2011 smashed into the power plant, sparking meltdowns that forced the evacuation of a huge area of northeastern Japan.

HAL – coincidentally the name of the evil supercomputer in Stanley Kubrick’s “2001: A Space Odyssey” – has a network of sensors that monitor the electric signals coming from the wearer’s brain. It uses these to activate the robot’s limbs in concert with the worker’s, taking weight off his or her muscles.

Yoshiyuki Sankai, professor of engineering at the University of Tsukuba, said this means the 60-kilogramme (130-pound) tungsten vest workers at Fukushima have to wear is almost unnoticeable. He said the outer layer of the robot suit also blocks radiation, while fans inside it circulate air to keep the wearer cool, and a computer can monitor their heart rate and breathing for signs of fatigue.

Continue reading

Sometimes real science sounds more like science fiction. Just the phrase “bionic bees” sounds like something out of an old paperback.

But that’s the goal of a new project from two U.K. universities, the University of Sheffield and the University of Sussex. Engineers from the schools are planning to scan the brains of bees and upload the data into flying robots with the hope that the machines will fly and act like the real thing.

The goal of the project is to create the first robots able to act on instinct. Researchers hope to implant a honey bee’s sense of smell and sight into the flying machines, allowing the robots to act as autonomously as an insect rather than relying on preprogrammed instructions.

Possible applications for the bionic bee include search and rescue missions at sites such as collapsed mines, detecting chemical or gas leaks, and even pollinating plants just like a real bee.

“The development of an artificial brain is one of the greatest challenges in artificial intelligence. So far, researchers have typically studied brains such as those of rats, monkeys, and humans, but actually ‘simpler’ organisms such as social insects have surprisingly advanced cognitive abilities,” James Marshall, head of the $1.61 million study, said in a statement.

Continue reading


Finally, someone has designed a way to convert one of the world’s biggest pests into something useful.

Using an electronic interface, a group of researchers from North Carolina State University have developed a method to steer and remotely control cockroaches. Rejoice.

“Our aim was to determine whether we could create a wireless biological interface with cockroaches, which are robust and able to infiltrate small spaces,” Alper Bozkurt said, according to Physorg.com.

Bozkurt, an assistant professor of electrical engineering at NC State, was co-author of the project’s paper, presented recently at the International Conference of the IEEE Engineering in Medicine and Biology Society in San Diego, Calif.

“Ultimately, we think this will allow us to create a mobile web of smart sensors that uses cockroaches to collect and transmit information, such as finding survivors in a building that’s been destroyed by an earthquake,” he said.

“Building small-scale robots that can perform in such uncertain, dynamic conditions is enormously difficult,” Bozkurt added. “We decided to use biobotic cockroaches in place of robots, as designing robots at that scale is very challenging and cockroaches are experts at performing in such a hostile environment.”

Continue reading

Nico looking in a mirror

A robot named Nico could soon pass a landmark test – recognising itself in a mirror.

Such self-awareness would represent a step towards the ultimate goal of thinking robots.

Nico, developed by computer scientists at Yale University, will take the test in the coming months.

The ultimate aim is for Nico to use a mirror to interpret objects around it, in the same way as humans use a rear-view mirror to look for cars.

“It is a spatial reasoning task for the robot to understand that its arm is on it, not on the other side of the mirror,” Justin Hart, the PhD student leading the research, told BBC News.

So far the robot has been programmed to recognise a reflection of its arm, but ultimately Mr Hart wants it to pass the “full mirror test”.
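The geometric core of that task can be sketched quite simply: if the robot knows where the mirror plane is, the apparent position of its arm in the glass is just its true position reflected across that plane. The snippet below is a minimal illustration with made-up coordinates, not Nico’s actual code.

```python
import numpy as np

def reflect_across_plane(point, plane_point, plane_normal):
    """Reflect a 3-D point across a mirror plane (a point on it plus its normal)."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)                  # unit normal
    p = np.asarray(point, dtype=float)
    signed_distance = np.dot(p - plane_point, n)
    return p - 2 * signed_distance * n

# Hypothetical setup: a mirror in the y-z plane at x = 1.0 m,
# and the robot's hand at x = 0.4 m.
mirror_point = np.array([1.0, 0.0, 0.0])
mirror_normal = np.array([1.0, 0.0, 0.0])
hand = np.array([0.4, 0.2, 1.1])

print(reflect_across_plane(hand, mirror_point, mirror_normal))
# -> [1.6 0.2 1.1]: the arm appears to lie behind the glass, but the robot
# can map that image back onto its own body on this side of the mirror.
```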

Continue reading

University of Florida researchers have moved a step closer to treating diseases on a cellular level by creating a tiny particle that can be programmed to shut down the genetic production line that cranks out disease-related proteins. In laboratory tests, these newly created “nanorobots” all but eradicated hepatitis C virus infection. The programmable nature of the particle makes it potentially useful against diseases such as cancer and other viral infections.

The research effort, led by Y. Charles Cao, a UF associate professor of chemistry, and Dr. Chen Liu, a professor of pathology and endowed chair in gastrointestinal and liver research in the UF College of Medicine, is described online this week in the Proceedings of the National Academy of Sciences.

“This is a novel technology that may have broad application because it can target essentially any gene we want,” Liu said. “This opens the door to new fields so we can test many other things. We’re excited about it.”

During the past five decades, nanoparticles — particles so small that tens of thousands of them can fit on the head of a pin — have emerged as a viable foundation for new ways to diagnose, monitor and treat disease. Nanoparticle-based technologies are already in use in medical settings, such as in genetic testing and for pinpointing genetic markers of disease. And several related therapies are at varying stages of clinical trial. The Holy Grail of nanotherapy is an agent so exquisitely selective that it enters only diseased cells, targets only the specified disease process within those cells and leaves healthy cells unharmed.

Continue reading


For all the speeches we hear about jobs these days, rarely does anyone mention robots.

They do occasionally, but usually it’s saved for the “innovation” speeches. This is understandable. If you’re running for office, better to keep the two ideas separated, because while jobs are good because they’re, well, jobs, and robots are good because they mean progress, mix the two together and soon enough people will start asking how you’ll be able to create a lot of jobs if these really smart machines are doing more and more of the work.

No, I’m not going all Luddite on you. I’m in awe of machines and the remarkable things they can now do. But that’s the point. We’re not talking about the technology of the past, which clearly made humans more productive and allowed us to move into better-paying jobs requiring more specialized skills.

Now we’re creating machines that are much more than tools. They’re learning to think and adapt, and technologists such as Martin Ford, author of The Lights in the Tunnel: Automation, Accelerating Technology and the Economy of the Future, believe that within five to ten years, machines will be able to surpass the ability of humans to do routine work. As he told The Fiscal Times: “It’s the first time we’ve had this level of technology that allows machines to solve problems on their own, to interact with their environment, to analyze visual images, and to manipulate their environment based on that.”

Continue reading

What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel — or at least the ability to identify different materials by touch.

Researchers at the University of Southern California’s Viterbi School of Engineering published a study June 18 in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.

The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. The sensor is capable of other human-like sensations as well: it can tell where and in which direction forces are applied to the fingertip, and even sense the thermal properties of an object being touched.

Like the human finger, the group’s BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.
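As described, texture identification comes down to matching the vibration “signature” picked up by the hydrophone against the signatures of known materials. The sketch below is a deliberately simplified illustration of that idea, not the team’s published algorithm: it compares band-averaged power spectra with a nearest-neighbour rule, using synthetic stand-in data.

```python
import numpy as np

def spectral_signature(vibration, n_bands=16):
    """Summarise a vibration trace as normalised power in a few frequency bands."""
    power = np.abs(np.fft.rfft(vibration)) ** 2
    bands = np.array_split(power, n_bands)
    signature = np.array([band.mean() for band in bands])
    return signature / signature.sum()        # ignore overall amplitude

def identify_texture(vibration, library):
    """Return the name of the library texture whose signature is closest."""
    sample = spectral_signature(vibration)
    return min(library,
               key=lambda name: np.linalg.norm(sample - spectral_signature(library[name])))

# Synthetic stand-ins: a ridged texture produces a strong 40 Hz "rubbing"
# component as the finger slides; a smooth one mostly broadband noise.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 1000)
library = {
    "corduroy": np.sin(2 * np.pi * 40 * t),
    "glass": 0.1 * rng.normal(size=t.size),
}
unknown = np.sin(2 * np.pi * 40 * t) + 0.05 * rng.normal(size=t.size)
print(identify_texture(unknown, library))   # -> "corduroy"
```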

Continue reading

A number of life-support machines are connected to each other, circulating liquids and air in an attempt to mimic a biological structure. The Immortal investigates human dependence on electronics, the desire to make machines replicate organisms and our perception of anatomy as reflected by biomedical engineering.


Jianhui manipulates objects with his hands and gets a drink as a reward. Unknown to him, not far away a robot hand mirrors his fingers’ moves as it receives instructions from the chips implanted in his brain.

Zheng Xiaoxiang of the Brain-Computer Interface Research Team at Zhejiang University in Zijingang, China, and colleagues announced earlier this week that they had succeeded in capturing and deciphering the signals from the monkey’s brain and translating them into real-time robotic finger movements.

The two sensors implanted in Jianhui’s brain monitor just 200 neurons in his motor cortex, Zheng says. However, this was enough to accurately interpret the monkey’s movements and control the robotic hand.

Humans have used electrodes to control prosthetic arms, but Zheng claims this research looks at the finer movements of the fingers.

“Hand moves are associated with at least several hundreds of thousands of neurons,” she said. “We now decipher the moves based on the signals of about 200 neurons. Of course, the orders we produced are still distant from the truly flexible finger moves in complexity and fineness.”
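To give a feel for what “deciphering moves” from roughly 200 neurons involves, here is a minimal sketch of a generic linear decoder of the kind often used as a baseline in brain-machine interface work. It is not the Zhejiang team’s actual algorithm, and the data are entirely synthetic: binned firing rates from 200 simulated neurons are mapped to five hypothetical finger-joint angles by least squares.

```python
import numpy as np

rng = np.random.default_rng(42)
n_neurons, n_samples, n_joints = 200, 500, 5

# Synthetic data: firing rates that are (noisily) linearly related to joint angles.
true_weights = rng.normal(size=(n_neurons, n_joints))
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
angles = rates @ true_weights + rng.normal(scale=0.5, size=(n_samples, n_joints))

# Fit the decoder on the first 400 samples by ordinary least squares...
W, *_ = np.linalg.lstsq(rates[:400], angles[:400], rcond=None)

# ...then drive the (here imaginary) robotic fingers from held-out activity.
predicted = rates[400:] @ W
mae = np.abs(predicted - angles[400:]).mean()
print(f"mean absolute error on held-out samples: {mae:.3f}")
```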

Cyborg bugs as first responders

The principal idea is to harvest the insect’s biological energy from either its body heat or movements. The device developed by engineers at the University of Michigan converts the kinetic energy from wing movements into electricity—prolonging battery life.

The battery can be used to power small sensors implanted on the insect (such as a small camera, a microphone, or a gas sensor) in order to gather vital information from hazardous environments.


“Through energy scavenging, we could potentially power cameras, microphones, and other sensors and communications equipment that an insect could carry aboard a tiny backpack,” says Khalil Najafi, professor of electrical and computer engineering. “We could then send these ‘bugged’ bugs into dangerous or enclosed environments where we would not want humans to go.”

In a paper published in the Journal of Micromechanics and Microengineering, the researchers describe several techniques for scavenging energy from wing motion and present data on the power measured from beetles.

The research was funded by the Hybrid Insect Micro Electromechanical Systems program of the Defense Advanced Research Projects Agency.

The university is pursuing patent protection for the intellectual property, and is seeking commercialization partners to help bring the technology to market.

It’s alarming enough when robots ingest plant detritus like twigs and grass clippings. It’s another thing entirely when they can start chowing down on members of the animal kingdom. A pair of prototype robots are designed to catch bugs, a major step on the path toward robots that can hunt, catch and digest their own meals.

The tiny robots are modeled after the lobes of Venus flytraps, which snap shut as soon as sensitive hairs inside detect an alighting insect. One prototype, developed at Seoul National University, is made of shape-memory materials that switch between two states when subjected to a current. The other, made at the University of Maine, uses artificial muscles made of a gold nanomaterial.

Continue reading
