UAV / UCAV / LAR (robots): news and stories

A little snippet of code and that thing can grab a troublemaker by the throat... Then add image recognition and a thermal camera to watch the crowd for whoever is plotting mischief, and the machine automatically picks those types out of the stream of people...

A method like this only stokes hatred of robots.
 
And what happens when the robot figures out it is being treated like dirt? Is the only way to beat an AI another AI?

First the robot would have to understand what an emotion is, and then it would have to analyze how to react to that emotion; right now that whole process is a major open problem with no solution.
 

Underwater gliders, propeller-driven submersibles, and other marine robots are increasingly being tasked with gathering information (e.g., in environmental monitoring, offshore inspection, and coastal surveillance scenarios). However, in most of these scenarios, human operators must carefully plan the mission to ensure completion of the task. Strict human oversight not only makes such deployments expensive and time-consuming but also makes some tasks impossible due to the requirement for heavy cognitive loads or reliable communication between the operator and the vehicle.

We can mitigate these limitations by making the robotic information gatherers semi-autonomous, where the human provides high-level input to the system and the vehicle fills in the details on how to execute the plan. These capabilities increase the tolerance for operator neglect, reduce deployment cost, and open up new domains for information gathering. In this talk, I will show how a general framework that unifies information theoretic optimization and physical motion planning makes semi-autonomous information gathering feasible in marine environments. I will leverage techniques from stochastic motion planning, adaptive decision making, and deep learning to provide scalable solutions in a diverse set of applications such as underwater inspection, ocean search, and ecological monitoring.
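
As a rough illustration of how information-theoretic optimization and motion planning can be combined, the sketch below (not the speaker's actual framework; the waypoints, belief values and travel budget are invented for the example) greedily picks survey waypoints by trading an entropy-based information-gain proxy against travel cost:

```python
# Hypothetical sketch: greedy information gathering under a travel budget.
import math

def entropy(p):
    """Shannon entropy of a Bernoulli belief p (uncertainty at a survey point)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def plan_route(start, candidates, budget, travel_weight=0.5):
    """Greedily choose waypoints while distance budget remains.

    candidates: dict mapping (x, y) waypoint -> belief p that something of
    interest is there. Each visit is scored as information gain (entropy proxy)
    minus a penalty proportional to travel distance.
    """
    route, pos = [], start
    while budget > 0 and candidates:
        def score(wp):
            dist = math.hypot(wp[0] - pos[0], wp[1] - pos[1])
            return entropy(candidates[wp]) - travel_weight * dist, dist
        best = max(candidates, key=lambda wp: score(wp)[0])
        gain, dist = score(best)
        if dist > budget or gain <= 0:
            break
        budget -= dist
        route.append(best)
        pos = best
        del candidates[best]
    return route

# Example: start at the origin with a 10-unit travel budget.
print(plan_route((0, 0), {(1, 1): 0.5, (4, 0): 0.9, (2, 3): 0.6}, budget=10))
```

A real planner would replace the greedy loop with stochastic motion planning over vehicle dynamics, but the trade-off between expected information gain and motion cost is the same idea.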

The techniques discussed here make it possible for autonomous marine robots to “go where no one has gone before,” allowing for information gathering in environments previously outside the reach of human divers.
https://www.ri.cmu.edu/ri-seminar-series/
 
[Image: P.1HH HammerHead Unmanned Aerial System to enter into service with UAE armed forces]


At UMEX 2018, the International Unmanned Systems Exhibition in Abu Dhabi (UAE), the Italian company Piaggio Aerospace showcased the latest development of its P.1HH HammerHead, a state-of-the-art Unmanned Aerial System (UAS) designed for Intelligence, Surveillance and Reconnaissance (ISR) missions. The HammerHead UAS was derived from the design of the P.180 Avanti II commercial aircraft.

The P.1HH HammerHead can be fitted with communications intelligence (COMINT), electronics intelligence (ELINT) and signals intelligence (SIGINT) payloads, giving it the capability to perform different types of missions. The mission management system (MMS) installed aboard the P.1HH HammerHead enables it to perform patrol and ISR missions.

The P.1HH HammerHead has a span of 15.6m, length of 14.4m and overall height of 3.98m. The maximum take-off weight of the UAS is 6,146kg. The wing area of the P.1HH HammerHead is 18m², horizontal tail area is 3.83m², vertical tail area is 4.73m² and forward wing area is 1.3m².

The core of the UAV is the Selex ES SkyISTAR Mission Management System (MMS), coupled with the firm’s Vehicle Control Management System (VCMS) that commands the aerodynamic control surfaces and manages the on-board equipment. VCMS LRUs are installed inside the large-volume fuselage, spaced for temperature control as well as survivability. Selex ES also supplies the remote-piloting Ground Control Station (GCS) and the UAS datalink and communications systems, which can work beyond line of sight (BLOS). The UAV and GCS are NATO STANAG USAR 4671 compliant, which will help streamline approval to fly in the airspace of other countries that have ratified this UAV airworthiness standard.
https://www.armyrecognition.com/ume...o_enter_in_service_with_uae_armed_forces.html
 

With support from the National Science Foundation (NSF), roboticist Robin Murphy of Texas A&M and her colleagues are developing some upgrades to make EMILY and other rescue robots "smarter" for large-scale water rescues, such as coming to the aid of a capsized ferry or water taxi. Among other things, the researchers are working with tethered drones to create an "eye in the sky" combined with onboard thermal sensing to autonomously navigate EMILY to a cluster of people. Because the drone is tethered, no one must "staff" it during rescue operations and it remains clear of any participating aircraft.
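
As a purely illustrative sketch of the "eye in the sky" idea (not the actual EMILY or drone software; the detection coordinates and clustering radius are invented for the example), the overhead thermal detections could be clustered and the surface robot steered toward the densest group:

```python
# Hypothetical sketch: steer a surface robot toward the largest cluster of
# thermal detections reported by an overhead drone.
import math

def largest_cluster_centroid(detections, radius=5.0):
    """Naive clustering: for each detection, count neighbors within `radius`
    and return the centroid of the densest neighborhood."""
    best, best_count = None, -1
    for d in detections:
        group = [p for p in detections
                 if math.hypot(p[0] - d[0], p[1] - d[1]) <= radius]
        if len(group) > best_count:
            best_count = len(group)
            best = (sum(p[0] for p in group) / len(group),
                    sum(p[1] for p in group) / len(group))
    return best

def heading_to(target, robot_pos):
    """Bearing (degrees) the robot should steer to reach the target."""
    dx, dy = target[0] - robot_pos[0], target[1] - robot_pos[1]
    return math.degrees(math.atan2(dy, dx))

people = [(10, 12), (11, 13), (12, 11), (40, 5)]   # thermal hits in map frame
goal = largest_cluster_centroid(people)
print(goal, heading_to(goal, robot_pos=(0, 0)))
```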
https://www.nsf.gov/news/special_reports/science_nation/waterrescuerobots.jsp

https://vahana.aero/alpha-one-takes-to-the-skies-bc636f3a5b83

http://www.businessinsider.com/airbus-vahana-aircraft-cost-same-taxi-zach-lovering-2017-5
 
In a new study, researchers used 3D printing and low-cost parts to create an inexpensive hyperspectral imager that is light enough to use onboard drones. They offer a recipe for creating these imagers, which could make the traditionally expensive analytical technique more widely accessible.

Hyperspectral imagers produce images like a traditional color camera but detect several hundred colors instead of the three detected by normal cameras. Each pixel of a hyperspectral image contains information covering the entire visible spectrum, providing data that can be used, for example, to automatically detect and sort objects or measure ocean color to map harmful algae blooms. Traditional hyperspectral imagers can cost tens of thousands of dollars and are very bulky and heavy.
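
To make the "spectrum per pixel" idea concrete, here is a minimal sketch (not taken from the paper; the cube dimensions and reference spectrum are synthetic) that flags pixels of a hyperspectral cube by their spectral angle to a known reference signature:

```python
# Hypothetical sketch: spectral angle mapping over a hyperspectral data cube.
import numpy as np

def spectral_angle_map(cube, reference):
    """cube: (rows, cols, bands) array; reference: (bands,) spectrum.
    Returns the angle (radians) between each pixel spectrum and the reference;
    small angles mean spectrally similar material."""
    pixels = cube.reshape(-1, cube.shape[-1]).astype(float)
    ref = reference.astype(float)
    cos = (pixels @ ref) / (np.linalg.norm(pixels, axis=1) * np.linalg.norm(ref) + 1e-12)
    return np.arccos(np.clip(cos, -1.0, 1.0)).reshape(cube.shape[:2])

# Example: a tiny 4x4 scene with 100 spectral bands and a synthetic reference.
cube = np.random.rand(4, 4, 100)
reference = np.random.rand(100)
mask = spectral_angle_map(cube, reference) < 0.1   # pixels matching the target
print(mask)
```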

In The Optical Society (OSA) journal Optics Express, the researchers detail how to make visible-wavelength hyperspectral imagers weighing less than half a pound for as little as $700 (USD). They also demonstrate that these imagers can acquire spectral data from aboard a drone.

"The instruments we made can be used very effectively on a drone or unmanned vehicle to acquire spectral images," said research team leader Fred Sigernes of University Centre in Svalbard (UNIS), Norway. "This means that hyperspectral imaging could be used to map large areas of terrain, for example, without the need to hire a plane or helicopter to carry an expensive and large instrument."
http://www.spacedaily.com/reports/L...ticated_imaging_capability_to_drones_999.html
 
Russia is holding competitions to develop new AUV solutions. The side that is inferior at sea tends to pursue innovative solutions. Back in the Cold War, the Soviet navy developed very innovative submarine, targeting and missile solutions, though in practice they perhaps fell short.

https://translate.google.com/transl...ovaniya-po-morskoi-robototekhnike/&edit-text=

Applications have opened for the first All-Russian student competitions in marine robotics, which will be held at the Far Eastern Federal University in September 2018. The organizing committee has approved the rules for the underwater robot competitions. Applications from teams are accepted until May 30. The main organizer of the competition is the Foundation for Advanced Studies.

The student competitions are held in two categories: autonomous and remotely operated unmanned underwater vehicles. Teams with autonomous robots will have to perform five contest tasks: reaching a designated area, following a cable line, searching for objects and marking them, passing through a gate at a given depth, and homing in on a hydroacoustic beacon, the FEFU press service reports.

Four tasks have been prepared for the remotely operated underwater robots. The operators piloting the robots must detect a submerged object, find an inscription on it and read it, deliver a marker to the sunken object, and attach a hose simulating a pipeline. Teams have 20 minutes to complete all tasks, and the maximum weight of a robot is 50 kg.

Teams of students and postgraduate students from Russian universities are invited to take part. Applications, with a brief description of the robotic system, should be sent to the FPI e-mail [email protected] by May 30. The presentation should contain a description, the purpose and basic technical characteristics of the vehicle, its modeling and simulation capabilities, design merits, and information about the team.

The All-Russian competitions in marine robotics are held on behalf of the collegium of the Military-Industrial Commission of Russia and Deputy Prime Minister Dmitry Rogozin. As part of the tournament in the summer of 2018 in Vladivostok, competitions will also be held among employees of government agencies, specialized Russian research centers and developer enterprises. The final will take place in August at Patrokl Bay as part of the International Military-Technical Forum "Army-2018".

The contest is held by the Foundation for Advanced Studies with the support of the board of the Military-Industrial Commission of the Russian Federation, the Office of the Plenipotentiary Representative of the President of the Russian Federation in the Far Eastern Federal District Yury Trutnev, the ministries of Industry and Trade, Transport, Defense, Emergency Situations, and Education and Science, Rosgvardia, the United Shipbuilding Corporation, Rosatom, the Tactical Missiles Corporation, Far Eastern Federal University, the Admiral Nevelskoy Maritime State University and other organizations.
 
Gestures, and reacting to them, are coming to new drones.

AnyMal dances to the beat of the music

 
A study led by researchers at Tokyo Institute of Technology (Tokyo Tech) has uncovered new ways of driving multi-legged robots by means of a two-level controller. The proposed controller uses a network of so-called non-linear oscillators that enables the generation of diverse gaits and postures, which are specified by only a few high-level parameters. The study inspires new research into how multi-legged robots can be controlled, including in the future using brain-computer interfaces.

In the natural world, many species can walk over slopes and irregular surfaces, reaching places inaccessible even to the most advanced rover robots. It remains a mystery how complex movements are handled so seamlessly by even the tiniest creatures.

What we do know is that even the simplest brains contain central pattern generator (CPG) circuits, which are wired up specifically for generating walking patterns. Attempts to replicate such circuits artificially have so far had limited success, due to poor flexibility.

Now, researchers in Japan and Italy propose a new approach to walking pattern generation, based on a hierarchical network of electronic oscillators arranged over two levels, which they have demonstrated using an ant-like hexapod robot. The achievement opens new avenues for the control of legged robots.
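
As a loose illustration of the CPG idea (not Tokyo Tech's actual controller; the gait offsets, coupling gain and frequency are made up for the example), a handful of coupled phase oscillators can lock into a hexapod tripod gait driven by a single high-level frequency parameter:

```python
# Hypothetical sketch: a CPG built from coupled phase oscillators.
import math

N_LEGS = 6
FREQ = 1.0                     # high-level parameter: step cycles per second
COUPLING = 2.0                 # strength pulling oscillators to desired offsets
# Desired phase offsets for a tripod gait: legs 0, 2, 4 in phase; 1, 3, 5 opposite.
OFFSETS = [0.0, math.pi, 0.0, math.pi, 0.0, math.pi]

def step(phases, dt=0.01):
    """Advance the coupled oscillators one time step (Kuramoto-style coupling)."""
    new = []
    for i in range(N_LEGS):
        coupling = sum(math.sin((phases[j] - OFFSETS[j]) - (phases[i] - OFFSETS[i]))
                       for j in range(N_LEGS) if j != i)
        new.append(phases[i] + dt * (2 * math.pi * FREQ + COUPLING * coupling / N_LEGS))
    return new

phases = [0.1, 2.0, 0.5, 3.5, 0.3, 2.8]    # arbitrary initial phases
for _ in range(2000):                      # oscillators synchronize into the gait
    phases = step(phases)
leg_angles = [math.sin(p) for p in phases] # e.g., mapped to hip joint commands
print([round(a, 2) for a in leg_angles])
```

Changing only FREQ (and, in a richer version, the offset table) retunes the whole gait, which is the sense in which a few high-level parameters can specify diverse gaits and postures.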
http://www.spacedaily.com/reports/Tokyo_Techs_six_legged_robots_get_closer_to_nature_999.html

http://ieeexplore.ieee.org/document/8270661/
 