Tag Archive: Robot


Urine-powered robot heart (Credit: Peter Walters et al. 2013, Bioinspiration & Biomimetics)

The fictional human-powered machines that appear in The Matrix trilogy are still far from reality — but maybe not that far. Last month, scientists at the Bristol Robotics Laboratory in the UK announced they had successfully created a prototype robotic “heart” that runs on human urine, fabricated with a 3D printer. A full working robot is still under development. For now, researchers built only the heart itself out of a rubber-like 3D-printed material known as TangoPlus, and demonstrated its ability to charge up to 3.5 volts and perform 33 pumps using just 2 milliliters of “fresh” human urine. But they have ambitious ideas for a future fleet of ecologically friendly robots, or EcoBots, “powered by energy from waste collected from urinals at public lavatories.”


Not all cubes are created equal. Not all robots are either, as it turns out. The proof is in a building at MIT, where a number of colorful blocks called M-Blocks are busy reconfiguring themselves into whatever arrangement they want. They can be used to create modular robots that are essentially indestructible, and if you think this sounds a bit too much like Terminator, that’s because IT’S EXACTLY TERMINATOR.

Each block has a flywheel inside it, which can spin at up to 20,000 revolutions per minute but can also stop within 10 milliseconds. When that sudden stop occurs, the flywheel’s angular momentum transfers to the cube’s body and causes it to flip. Magnets on the sides of each cube let them stick to each other, and by moving cubes around, it’s possible to create all kinds of different shapes. The flywheels are powerful enough that the cubes have lots of different ways to get around, too: “a low amount of energy will cause it just to roll forward, an intermediate amount of energy might cause it to climb a wall and the highest amount of energy will cause it to do something like jump,” says MIT roboticist Kyle Gilpin.
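A back-of-the-envelope sketch of that flywheel trick: stopping a spinning flywheel in 10 milliseconds dumps its angular momentum into the cube as a brief, large reaction torque. The flywheel mass and radius below are invented for illustration; the article doesn’t give the real M-Blocks specs.

```python
import math

# Rough sketch of the flywheel "flip" described above. Mass and radius
# are assumed values for illustration, not actual M-Blocks hardware specs.
flywheel_mass = 0.05        # kg (assumed)
flywheel_radius = 0.02      # m (assumed)
rpm = 20_000                # stated top speed
stop_time = 0.010           # s (stated braking time)

# Solid-disc moment of inertia: I = 1/2 * m * r^2
inertia = 0.5 * flywheel_mass * flywheel_radius**2
omega = rpm * 2 * math.pi / 60          # rad/s
angular_momentum = inertia * omega      # kg*m^2/s

# Braking transfers that momentum to the cube body; the average
# reaction torque over the stop is L / dt.
torque = angular_momentum / stop_time   # N*m

print(f"angular momentum: {angular_momentum:.4f} kg*m^2/s")
print(f"average torque on cube: {torque:.2f} N*m")
```

Even with these modest assumed numbers, the braking torque is on the order of a couple of newton-meters, easily enough to pivot a small cube over its edge.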

So what can these M-Blocks actually do? Well, not all that much at the moment, aside from arrange themselves into cool patterns and such. But the future could hold great applications for such a product. They could help build temporary bridges and structures, and they could move themselves over hard-to-navigate terrain. And as the cubes get smaller and more capable, they definitely won’t form themselves into a killer robot that goes back in time and destroys all humans.

Awesome video below.

Nobody actually enjoys going to the doctor, but it’s just something in life that’s pretty much unavoidable — until now. Sharp recently unveiled its solution to a traditional physical examination at CEATEC 2013: a digital doctor’s chair that can simultaneously record blood pressure, temperature, pulse and other vital signs for optimum health awareness.

The chair, which is royally named the “Emperor 1510,” is a compact, sensor-laden piece of equipment that stores your health statistics in cloud storage that most likely can be accessed by your physician for reference. The chair also does away with time-consuming procedures to test each section of your body separately with different machines, which can be frustrating for impatient patients.

Three monitor displays are situated on the top of the chair in order to display your vital stats as the Emperor 1510 does its scan. A blood pressure monitor strap is located on the right armrest of the chair, while the left side has a device where a finger can be inserted, possibly to test for pulse or temperature.

Since the machine can effectively scan and record your body’s condition in one fell swoop, picking up early symptoms of sickness shouldn’t be a problem, either. This might lead to quicker intervention and recovery time, or might even keep you from getting sick altogether. That’s something I’ll sit down for!

The machine doesn’t come cheap at a starting price of $5,950, but I’m sure rich, doctor-hating folk will definitely spring for a model to put in their west wing. Health doesn’t come cheap, after all!

Researchers at Toshiba’s Akimu Robotic Research Institute were thrilled ten months ago when they successfully programmed Kenji, a third-generation humanoid robot, to convincingly emulate certain human emotions. At the time, they even claimed that Kenji was capable of the robot equivalent of love. Now, however, they fear that his programming has taken an extreme turn for the worse.

“Initially, we were thrilled to see a bit of our soul come alive in this so-called ‘machine,’” said Dr. Akito Takahashi, the principal investigator on the project. “This was really the last step for us in one of the fundamentals of the singularity.”

Kenji was part of an experiment involving several robots loaded with custom software designed to let them react emotionally to external stimuli. After some limited environmental conditioning, Kenji first demonstrated love by bonding with a stuffed doll in his enclosure, which he would embrace for hours at a time. He would then make simple, but insistent, inquiries about the doll if it were out of sight. Researchers attributed this behavior to his programmed qualities of devotion and empathy and called the experiment a success.

What they didn’t count on were the effects of several months of self-iteration within the complex machine-learning code which gave Kenji his first tenderness. As of last week, Kenji’s love for the doll, and indeed anybody he sets his ‘eyes’ on, is so intense that Dr. Takahashi and his team now fear to show him to outsiders.


A humanoid robot designed to alleviate the isolation of astronauts by having conversations has been deployed to the International Space Station (ISS).

The world’s first talking robot to be sent into space was dispatched on his mission to the ISS by Japanese space agency JAXA over the weekend. Kirobo, modelled on Astro Boy, is expected to arrive at the station on 9 August, where he will join future ISS commander Koichi Wakata as a friendly robotic companion.

The aim of the 34-centimetre, 1-kilogram robot is to study whether machines can lend emotional support to humans in isolated conditions. Its name, which is derived from the Japanese words for “hope” and “robot”, is reflective of this idea. “Hope of the Japanese technology,” the website says. “Hope for tomorrow’s children. It carries hope on its small shoulders. Hope for the future of humans living together with robots.”


Mr. Spock may think space is the final frontier, but Earth’s deep oceans are just as mysterious and unknown. Now, one scientist says thousands of people could explore the oceans using cheap, remotely controlled robots.

“The deep has even more cool stuff than space,” said Eric Stackpole, a researcher at NASA Ames Research Center in Mountain View, Calif.

For instance, there’s a possibility that alien life exists on distant exoplanets, but scientists know for sure that hundreds or thousands of undiscovered species lurk beneath the waves, Stackpole said Sunday (May 19) here at the 2013 Maker Faire Bay Area, a two-day celebration of DIY science, technology and engineering.

And it’s not just scientists who can do the discovering: Interested amateurs could launch an army of these DIY submarines to reveal the mysteries of the deep.

Stackpole is the co-founder of OpenROV, an organization that has created an open-source, underwater vehicle that can explore up to 328 feet (100 meters) beneath the ocean’s surface.

The submarine is the size of a shoebox and is made with off-the-shelf, $10 electronics, such as thrusters and motors, and laser-cut acrylic. It wirelessly connects to a remote controller, and can descend at 3.28 feet (1 meter) per second.
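To put the quoted figures together: at the stated 1-meter-per-second descent rate, reaching the 100-meter rated depth takes well under two minutes, and the hydrostatic pressure there is roughly ten atmospheres above the surface. A quick sketch, using standard assumed constants for seawater density and gravity:

```python
# Rough numbers for the OpenROV figures quoted above: time to reach the
# 100 m rated depth at the stated 1 m/s descent rate, and the water
# pressure waiting there. Seawater density and g are standard assumed
# constants, not values from the article.
depth = 100.0          # m, rated depth
descent_rate = 1.0     # m/s (the article's 3.28 ft/s)
rho = 1025.0           # kg/m^3, typical seawater density (assumed)
g = 9.81               # m/s^2

descent_time = depth / descent_rate      # s
gauge_pressure = rho * g * depth         # Pa, above surface pressure
atmospheres = gauge_pressure / 101_325   # in atmospheres

print(f"descent time: {descent_time:.0f} s")
print(f"pressure at depth: {gauge_pressure / 1000:.0f} kPa "
      f"(~{atmospheres:.1f} atm above the surface)")
```

That ten-atmosphere figure is why even a shoebox-sized hull needs careful sealing at the rated depth.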

So far, these rovers have plumbed the depths of the flooded Hall City Cave in northern California, which is rumored to harbor gold, and spent time at the Aquarius Underwater Laboratory off Key Largo, Fla., where aquanauts spend 10 days underwater. An open remotely operated vehicle (ROV) has even dived beneath the ice in the Ross Sea in Antarctica, where penguins also got cozy with the underwater vehicle.


In the very early hours of the morning, in a Harvard robotics laboratory last summer, an insect took flight. Half the size of a paperclip, weighing less than a tenth of a gram, it leapt a few inches, hovered for a moment on fragile, flapping wings, and then sped along a preset route through the air.

Like a proud parent watching a child take its first steps, graduate student Pakpong Chirarattananon immediately captured a video of the fledgling and emailed it to his adviser and colleagues at 3 a.m. — subject line, “Flight of the RoboBee.”

“I was so excited, I couldn’t sleep,” recalls Chirarattananon, co-lead author of a paper published this week in Science.

The demonstration of the first controlled flight of an insect-sized robot is the culmination of more than a decade’s work, led by researchers at the Harvard School of Engineering and Applied Sciences (SEAS) and the Wyss Institute for Biologically Inspired Engineering at Harvard.

“This is what I have been trying to do for literally the last 12 years,” says Robert J. Wood, Charles River Professor of Engineering and Applied Sciences at SEAS, Wyss Core Faculty Member, and principal investigator of the National Science Foundation-supported RoboBee project. “It’s really only because of this lab’s recent breakthroughs in manufacturing, materials, and design that we have even been able to try this. And it just worked, spectacularly well.”

Inspired by the biology of a fly, with submillimeter-scale anatomy and two wafer-thin wings that flap almost invisibly, 120 times per second, the tiny device not only represents the absolute cutting edge of micromanufacturing and control systems; it is an aspiration that has impelled innovation in these fields by dozens of researchers across Harvard for years.


Pollinators participate in the sexual reproduction of plants. When you eat an almond, beet or watermelon, or sip on coffee, you’re partaking of an ancient relationship between pollinators and flowers. But since the 1990s, worldwide bee health has been in decline, and most evidence points to toxic pesticides created by Shell and Bayer and to the loss of genetic biodiversity caused by the proliferation of GMO monocrops created in laboratories by biotech companies like Monsanto.

But never worry, those real life pollinators—the birds and the bees, as they say—may soon be irrelevant to the food needs of civilization. Harvard roboticists are developing a solution to the crisis: swarms of tiny robot bees made of titanium and plastic that can pollinate those vast dystopian fields of GMO cash crops.


The first bionic hand that allows an amputee to feel what they are touching will be fitted later this year in a pioneering operation that could introduce a new generation of artificial limbs with sensory perception.

The patient is an unnamed man in his 20s living in Rome who lost the lower part of his arm following an accident, said Silvestro Micera of the Ecole Polytechnique Federale de Lausanne in Switzerland.

The wiring of his new bionic hand will be connected to the patient’s nervous system with the hope that the man will be able to control the movements of the hand as well as receiving touch signals from the hand’s skin sensors.

Dr Micera said that the hand will be attached directly to the patient’s nervous system via electrodes clipped onto two of the arm’s main nerves, the median and the ulnar nerves.

This should allow the man to control the hand by his thoughts, as well as receiving sensory signals to his brain from the hand’s sensors. It will effectively provide a fast, bidirectional flow of information between the man’s nervous system and the prosthetic hand.

“This is real progress, real hope for amputees. It will be the first prosthetic that will provide real-time sensory feedback for grasping,” Dr Micera said.


Scientists build the One Million Dollar Man: how to make a bionic man

When Luke Skywalker received a perfect bionic replacement for the hand that was cut off in Star Wars Episode V, the idea of replicating human organs and body parts seemed far-fetched.

Thirty years later, the idea is no longer just science fiction. Scientists, among them the creators of “Rex” – the world’s most complete bionic man, unveiled in London this week – believe they can now replicate about two-thirds of the human body.

“We were surprised how many of the parts of the body can be replaced,” said Rich Walker, managing director of the robotics team Shadow, who built Rex. “There are some vital organs missing, like the stomach, but 60 to 70 per cent of a human has effectively been rebuilt.” This is heralded, then, as the dawn of the age of bionic man – although specialists caution that we are still feeling our way.

Social psychologist Bertolt Meyer, who also worked on Rex, has an interesting perspective: he was born without his left hand and has a prosthesis. “I have looked for new bionic technologies out of personal interest for a long time and I think that until five or six years ago nothing much was happening,” he said. “Suddenly we are at a point where we can build a body that is great and beautiful in its own special way.”

Not everyone in the field believes the recent progress, impressive as it is, places us on the road to complete replication of human limbs, organs and tissue. “We have motors which can lift things but, if you want to mimic the dexterity of a hand, we are not there yet,” said Professor Steven Hsiao of Johns Hopkins University in Baltimore.

“What we are beginning to achieve is building prostheses which look like human body parts, but we are a long way away from making ones which relay sensory information the way the human body does.”

Professor Hsiao drew the comparison between Star Wars and real life, saying: “The goal is the scene in the film where Luke Skywalker gets his new hand tested and is able to feel pain: we are not there. In 10 years, we will be able to build a robot which has the dexterity to pick up a pen and write with it, but it will not be able to send back sensory information.”


This is a video of an industrial robot that’s been programmed to carve logs into stools with a chainsaw, at least for now!

Lockheed Martin’s latest promo video of the HULC exoskeleton designed to ease a Soldier’s load by turning him into a temporary robot.

You gotta hand it to the marketers who come up with robot acronyms. Can it get any better than Extreme Access System for Entry (EASE)?

Sounds innocuous enough, right? Until this little critter tries to float into your room to spy on you. It’s one of two bots unveiled by CyPhy Works, headed by iRobot co-founder Helen Greiner.

EASE and PARC (that’s Persistent Aerial Reconnaissance & Communication), a communications relay, are compact flying machines that can fly between 3 feet and 1,000 feet while remaining tethered to their human controllers via microfilaments.

The cable serves as a source of power and a medium for HD video transmission, keeping the bots aloft for longer than traditional wireless UAVs and preventing jamming by enemies. EASE and PARC are designed to stick to one area instead of flying long distances.

They could be used by military, police, and first responders in disasters to scope out buildings and search for enemy soldiers, criminals, or people in distress. They could also inspect buildings, bridges, and other structures for damage.

Check out the vid below of EASE being put to the test at Fort Benning, Georgia. What would you do if you saw it outside your window?

In what looks like a robot scene pulled from The Terminator, a government agency has released a video of a search-and-rescue robot that can do everything from climbing stairs to crossing narrow passages.

The Defense Advanced Research Projects Agency (DARPA) — which is a part of the U.S. Department of Defense — uploaded the video to YouTube to bring attention to the DARPA Robotics Challenge (DRC). The contest is looking for robots that can maneuver and assist during dangerous disaster-relief situations. The winning team will be awarded $2 million.

The robot in the video — which is called Pet-Proto and is a predecessor of DARPA’s Atlas robot — navigates a series of obstacles similar to those robots will face in the challenge. It has decision-making abilities to determine the best route to take, when to jump and what to avoid.

New HAL Exoskeleton: Brain-Controlled Full Body Suit to Be Used in Fukushima Cleanup

Japanese company Cyberdyne announced today an improved version of HAL (Hybrid Assistive Limb), the exoskeleton we wrote about almost two years ago, when a tech journalist took a few steps at CES 2011 wearing the brain-controlled cyber-trousers. The latest version of HAL is still brain-controlled but has evolved into a full-body robot suit that lets the wearer carry heavy radiation shielding without feeling its weight. Eventually it could be used by workers dismantling the crippled Fukushima nuclear plant.

The new type of HAL is on display today at the Japan Robot Week exhibition in Tokyo. It will be used by workers at nuclear disaster sites and will be field tested at Fukushima, where a tsunami in March 2011 smashed into the power plant, sparking meltdowns that forced the evacuation of a huge area of northeastern Japan.

HAL – coincidentally the name of the evil supercomputer in Stanley Kubrick’s “2001: A Space Odyssey” – has a network of sensors that monitor the electric signals coming from the wearer’s brain. It uses these to activate the robot’s limbs in concert with the worker’s, taking weight off his or her muscles.
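As a toy illustration of that sense-then-assist idea (not Cyberdyne’s actual control scheme, which isn’t detailed here), a threshold-and-gain mapping from a sensed bio-electric signal to an assist torque might look like the following; every number is invented:

```python
# Toy sketch of the sensing-to-actuation loop described above: read a
# bio-electric signal level and command a proportional assist torque once
# it crosses a threshold. Signal values, the threshold, the gain and the
# torque limit are all invented for illustration.
def assist_torque(signal_uV, threshold_uV=5.0, gain=0.8, max_torque=40.0):
    """Map a sensed signal (microvolts) to an assist torque (N*m)."""
    if signal_uV < threshold_uV:
        return 0.0                      # below threshold: no assist
    torque = gain * (signal_uV - threshold_uV)
    return min(torque, max_torque)      # clamp to actuator limit

# Simulated readings as the wearer starts to move a limb
for reading in [2.0, 6.0, 30.0, 80.0]:
    print(f"{reading:5.1f} uV -> {assist_torque(reading):.1f} N*m")
```

The real system presumably does far more filtering and per-joint coordination; the point of the sketch is only the shape of the loop: sense, threshold, amplify, clamp.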

Yoshiyuki Sankai, professor of engineering at the University of Tsukuba, said this means the 60-kilogramme (130-pound) tungsten vest workers at Fukushima have to wear is almost unnoticeable. He said the outer layer of the robot suit also blocks radiation, while fans inside it circulate air to keep the wearer cool, and a computer can monitor their heart-rate and breathing for signs of fatigue.


Bio-scaffolds go electric (Image: Charles M. Lieber and Daniel S. Kohane)

They beat like real heart cells, but the rat cardiomyocytes in a dish at Harvard University are different in one crucial way. Snaking through them are wires and transistors that spy on each cell’s electrical impulses. In future, the wires might control their behaviour too.

Versions of this souped-up, “cyborg” tissue have been created for neurons, muscle and blood vessels. They could be used to test drugs or as the basis for biological versions of existing implants such as pacemakers. If signals can also be sent to the cells, cyborg tissue could be used in prosthetics or to create tiny robots.

“It allows one to effectively blur the boundary between electronic, inorganic systems and organic, biological ones,” says Charles Lieber, who leads the team behind the cyborg tissue.

Artificial tissue can already be grown on three-dimensional scaffolds made of biological materials that are not electrically active. And electrical components have been added to cultured tissue before, but not integrated into its structure, so they were only able to glean information from the surface.


Nico looking in a mirror

A robot named Nico could soon pass a landmark test – recognising itself in a mirror.

Such self-awareness would represent a step towards the ultimate goal of thinking robots.

Nico, developed by computer scientists at Yale University, will take the test in the coming months.

The ultimate aim is for Nico to use a mirror to interpret objects around it, in the same way as humans use a rear-view mirror to look for cars.

“It is a spatial reasoning task for the robot to understand that its arm is on it, not on the other side of the mirror,” Justin Hart, the PhD student leading the research, told BBC News.

So far the robot has been programmed to recognise a reflection of its arm, but ultimately Mr Hart wants it to pass the “full mirror test”.


University of Florida researchers have moved a step closer to treating diseases on a cellular level by creating a tiny particle that can be programmed to shut down the genetic production line that cranks out disease-related proteins. In laboratory tests, these newly created “nanorobots” all but eradicated hepatitis C virus infection. The programmable nature of the particle makes it potentially useful against diseases such as cancer and other viral infections.

The research effort, led by Y. Charles Cao, a UF associate professor of chemistry, and Dr. Chen Liu, a professor of pathology and endowed chair in gastrointestinal and liver research in the UF College of Medicine, is described online this week in the Proceedings of the National Academy of Sciences.

“This is a novel technology that may have broad application because it can target essentially any gene we want,” Liu said. “This opens the door to new fields so we can test many other things. We’re excited about it.”

During the past five decades, nanoparticles — particles so small that tens of thousands of them can fit on the head of a pin — have emerged as a viable foundation for new ways to diagnose, monitor and treat disease. Nanoparticle-based technologies are already in use in medical settings, such as in genetic testing and for pinpointing genetic markers of disease. And several related therapies are at varying stages of clinical trial. The Holy Grail of nanotherapy is an agent so exquisitely selective that it enters only diseased cells, targets only the specified disease process within those cells and leaves healthy cells unharmed.


The art of humorous storytelling in Japan, known as rakugo, isn’t as popular as it once was. But now an android has joined the ranks of comics who kneel on cushions while spinning out jokes. The narrative droid is a copy of Beicho Katsura III, an 86-year-old rakugo comic recognized by the government as a Living National Treasure. The Beicho Android, as it’s known, is the work of Osaka University professor Hiroshi Ishiguro, creator of the Geminoid series of lifelike androids, and makeup artist Shinya Endo. Powered by air servos, the droid has all the idiosyncratic moves of Beicho performing rakugo, an art in which performers wear kimono and use only a kerchief and hand fan as props.

As seen in the vid below, it waves its arms, bows its head, and speaks in a gravelly voice like the master while narrating tales. Its mouth isn’t all that expressive but from far away, it’s hard to notice. The robot cracked up a few journalists at a press conference. It took two months to build and cost some $1 million, according to Sankei News. It was unveiled as part of an exhibition that combines a retrospective on Beicho’s career with exhibits on cutting edge tech in Osaka. It’s on from August 1 to 9 at Sankei Hall Breeze, where the droid is slated to do hourly impersonations of the elderly artist.

Hot Glue Gun Bot: The Hot Melt Adhesive robot can do more than climb walls, as seen here — it can fashion its own tools, too. (Credit: ETH Zurich)

Most robots are designed to do a couple specific things, which is one reason why the adaptability requirements in DARPA’s robotics challenge will be so interesting. But not everyone has the funds or know-how to build a robot that can do anything. Instead, the robotics whiz teams at ETH Zurich are giving robots the ability to build any new tool for itself, whenever the need might arise. It just comes with a hot glue gun, which the robot uses like a low-tech 3-D printer.

Other robots have used hot glue guns before, primarily to climb up walls — Israeli researchers and the ETH researchers themselves are among those who’ve built such surface-scaling bots. But if you’ve ever played with a hot glue gun, you know the tool can be used to do much more than form an adhesive — you could make any shape you want and simply let it cool, hardening into a milky-looking object of your design. That’s what this new robot does.

It uses hot glue to form the base and sides of a cup one layer at a time, much as a 3-D printer deposits material layer by layer. The robot also builds a handle and attaches it to the cup so it can tote the water vessel between two separate containers. The whole process takes about an hour. All the tasks were performed autonomously, reports IEEE Spectrum, which spotted the bot at the ICRA conference over the weekend, though the cup design was pre-programmed.
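The layer-by-layer deposition described here is essentially toolpath generation. A minimal sketch of how circular waypoints for a cup wall could be produced follows; the dimensions and the waypoint format are illustrative assumptions, not the ETH Zurich robot’s actual planner.

```python
import math

# Generate a circular toolpath for each layer of a cup wall, the way a
# glue-depositing robot (or a fused-deposition 3-D printer) would trace
# it. All dimensions are illustrative assumptions.
def cup_wall_toolpath(radius_mm=30.0, layer_height_mm=2.0,
                      num_layers=20, points_per_layer=36):
    """Return a list of (x, y, z) waypoints tracing the cup wall."""
    path = []
    for layer in range(num_layers):
        z = layer * layer_height_mm          # each layer sits atop the last
        for i in range(points_per_layer):
            angle = 2 * math.pi * i / points_per_layer
            path.append((radius_mm * math.cos(angle),
                         radius_mm * math.sin(angle),
                         z))
    return path

waypoints = cup_wall_toolpath()
print(f"{len(waypoints)} waypoints, top layer at z = {waypoints[-1][2]} mm")
```

A real deposition planner would also control feed rate and glue flow per segment, but the nested loop over layers and angles is the core of the idea.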

Ideally, future versions of this robot would be able to figure out exactly what type of tool is needed for a given task, and be able to design and build said tool. It certainly works more slowly than a 3-D printer, but it’s much simpler, too. Watch in the video below.
