UAV / UCAV / LAR (robots) News and stories

A test project using powerful drones to fly urgent medical samples from isolated Scottish islands is being expanded this winter after successful recent trials.

The tests will involve flying blood and fluid samples from Hebridean islands such as Coll and Tiree to hospital labs on the mainland in a fraction of the time needed to take them by road and ferry.

After a week’s test run from a hospital in Oban to the island of Mull in the spring, the London-based firm Skyports and NHS Highland will test the drones during the winter to see how well they cope with much tougher flying conditions.

The longer trial, funded by the UK Space Agency, will allow NHS Highland to confirm whether drones can be permanently integrated into its transport of emergency supplies and test samples, and could eventually be used throughout Scotland’s islands.

The agency said its funding, which includes support for projects on mental health and loneliness using satellite technology, was intended to bolster the NHS’s capacity to cope with the coronavirus pandemic.
 

As if it’s not hard enough to make very small robots in the first place, once you’ve gotten the power and autonomy all figured out (good luck with that), your robot isn’t going to be all that useful unless it can carry some payload. And the payload that everybody wants robots to carry is a camera, which is of course a relatively big, heavy, power-hungry payload. Great, just great.

This whole thing is frustrating because tiny, lightweight, power efficient vision systems are all around us. Literally, all around us right this second, stuffed into the heads of insects. We can’t make anything quite that brilliant (yet), but roboticists from the University of Washington, in Seattle, have gotten us a bit closer, with the smallest wireless, steerable video camera we’ve ever seen—small enough to fit on the back of a microbot, or even a live bug.
 

The vast majority of drones are rotary-wing systems (like quadrotors), and for good reason: They’re cheap, they’re easy, they scale up and down well, and we’re getting quite good at controlling them, even in very challenging environments. For most applications, though, drones lose out to birds and their flapping wings in almost every way—flapping wings are very efficient, enable astonishing agility, and are much safer, able to make compliant contact with surfaces rather than shredding them like a rotor system does. But flapping wings have their challenges too: Making flapping-wing robots is so much more difficult than just duct-taping spinning motors to a frame that, with a few exceptions, we haven’t seen nearly as much improvement as we have in more conventional drones.

In Science Robotics last week, a group of roboticists from Singapore, Australia, China, and Taiwan described a new design for a flapping-wing robot that offers enough thrust and control authority to make stable transitions between aggressive flight modes—like flipping and diving—while also being able to efficiently glide and gently land. While still more complex than a quadrotor in both hardware and software, this ornithopter’s advantages might make it worthwhile.
 
The US Army Research Laboratory says it is experimenting with reinforcement learning algorithms to control swarms of drones and autonomous vehicles to overwhelm and dominate America's enemies.

“Finding optimal guidance policies for these swarming vehicles in real-time is a key requirement for enhancing warfighters' tactical situational awareness, allowing the US Army to dominate in a contested environment,” said Dr Jemin George, a scientist at the US Army Combat Capabilities Development Command, a boffinry nerve center of the US Army.

Dr George and his colleagues developed a method to control large swarms of agents by collecting them into groups using hierarchical reinforcement learning (HRL). Shifting drone control from a centralized approach to a hierarchical design cut the software's learning time by 80 per cent, we're told.

Crucially, it means swarms of trained, unmanned equipment can be sent to particular areas with a set of instructions, and each collective maintains formation automatically among themselves to carry out those orders. Thus, human controllers won't have to worry about individual drones and vehicles; they can just point the groups at particular positions on a map, and the machines will have learned to figure out their positioning for themselves, moving as a team to where they are ordered and working together as intended, like a combat unit.
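
To make that hierarchy concrete, here is a minimal Python sketch of the decomposition described above: a high-level policy hands each group a goal position on the map, and a low-level controller keeps that group's drones in formation around it. Everything here (the class names, the circular formation, the proportional control step) is a hypothetical illustration; in particular, the learned HRL policies from the Army work are replaced by simple hand-written rules so the example stays self-contained.

```python
# Sketch of hierarchical swarm control: a high-level layer assigns one goal
# per group, and a low-level layer keeps each group's members in formation.
# Names and control laws are illustrative, not the Army lab's actual code.
import numpy as np


class HighLevelPolicy:
    """Assigns one target position to each group (the 'point the group at
    a spot on the map' step, done here by a trivial lookup rather than a
    learned policy)."""
    def assign_goals(self, group_ids, targets):
        return dict(zip(group_ids, targets))


class GroupController:
    """Low-level controller: each agent steers toward its own slot in a
    circular formation centered on the group's assigned goal."""
    def __init__(self, num_agents, radius=5.0, gain=0.1):
        self.offsets = np.array([
            [radius * np.cos(2 * np.pi * i / num_agents),
             radius * np.sin(2 * np.pi * i / num_agents)]
            for i in range(num_agents)
        ])
        self.gain = gain

    def step(self, positions, goal):
        # Proportional step toward each agent's formation slot.
        desired = goal + self.offsets
        return positions + self.gain * (desired - positions)


# Two groups of four drones, each sent to a different map position.
rng = np.random.default_rng(0)
groups = {g: rng.uniform(0, 10, size=(4, 2)) for g in ("alpha", "bravo")}
goals = HighLevelPolicy().assign_goals(
    groups.keys(),
    [np.array([50.0, 20.0]), np.array([10.0, 60.0])],
)
controllers = {g: GroupController(num_agents=4) for g in groups}

for _ in range(200):  # simulate a short mission
    for g in groups:
        groups[g] = controllers[g].step(groups[g], goals[g])

for g in groups:
    print(g, "mean position:", groups[g].mean(axis=0).round(1))
```

Running the sketch shows both groups converging on their assigned positions without any per-drone commands, which is the practical payoff the researchers describe: the operator reasons about groups, and the formation-keeping happens a level below.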
 