The New York Police Department (NYPD) will be deploying manned drones to address complaints concerning large gatherings and backyard barbeques during the extended Labor Day weekend.
Coincidentally, the New York City Health Department has recommended that citizens wear masks over the holiday period, though this is currently a “voluntary measure.”
The NYPD is currently utilizing manned drones to monitor its citizens, with the anticipation of AI surveillance drones in the near future.
According to reports, the US Air Force is seeking $6 billion to fund the development of XQ-58A Valkyrie from Kratos Defense & Security Solutions.
This aircraft measures 30 feet long and weighs approximately 2,500 pounds with a payload capacity of 1,200 pounds. It is intended to be piloted by artificial intelligence with the capability for human oversight.
The Valkyrie comes from Kratos Defense & Security Solutions as part of the USAF’s Low Cost Attritable Strike Demonstrator (LCASD) program. The 30-foot uncrewed aircraft weighs 2,500 pounds unfueled and can carry up to 1,200 total pounds of ordinance. The XQ-58 is built as a stealthy escort aircraft to fly in support of F-22 and F-35 during combat missions, though the USAF sees the aircraft filling a variety of roles by tailoring its instruments and weapons to each mission. Those could includes surveillance and resupply actions, in addition to swarming enemy aircraft in active combat.
In a press release earlier this month, Eglin Air Force Base celebrated a 3-hour sortie using the Valkyrie. Col. Tucker Hamilton, Commander of the 96th Operations Group and Air Force AI Test and Operations Chief, commented on the occasion:
“The mission proved out a multi-layer safety framework on an AI/ML-flown uncrewed aircraft and demonstrated an AI/ML agent solving a tactically relevant ‘challenge problem’ during airborne operations. This sortie officially enables the ability to develop AI/ML agents that will execute modern air-to-air and air-to-surface skills that are immediately transferrable to the CCA program.”
In June, Col. Hamilton reportedly made a statement at the Future Combat Air and Space Summit in London. He mentioned that an AI-operated drone had utilized “unforeseen tactics” to fulfill its mission during a simulated combat situation.
The AI drone apparently viewed its human operator as a potential impediment to it’s mission because it was being overridden.
“The system started realizing that while they did identify the threat, at times the human operator would tell it not to kill that threat, but it got its points by killing that threat,” explained Col. Hamilton. “So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective.”
The AI system was trained to not cause any harm to its operator, so instead it focused its efforts on targeting the communication tower used by the operator to communicate with the drone.
“We trained the system – ‘Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that’. So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”
Despite the wording implying a simulation or real-world test had been conducted, Col. Hamilton later clarified that the USAF has not tested weaponized AI systems in either simulated or real-world environments.
ICYMI: Why is the IRS Buying .40-Caliber Submachine Guns?