Saturday, September 8, 2012
Defense Science Board report: DoD should ‘more aggressively use autonomy in military missions’
Image credit: ACQ.OSD.mil
Madison Ruppert, Contributor
In a recently released report entitled “The Role of Autonomy in DoD Systems,” made publicly available by Steven Aftergood of the Federation of American Scientists, the Defense Science Board (DSB) sets forth recommendations for increased use of autonomous systems in drones and other unmanned platforms used in Department of Defense (DoD) missions.
Aftergood brings out some of the more disturbing details in the report, including the Board’s recommendation that the DoD “more aggressively use autonomy in military missions.” Thankfully, autonomous doesn’t mean completely autonomous just yet, or as Aftergood humorously puts it, “The Board is not calling for the immediate development of Skynet at this time.”
This should indeed be quite concerning for Americans, considering that we’re seeing a massive expansion of the use of military drones in concert with law enforcement in the United States, not to mention a plethora of potential or current drone bases across the nation. Indeed, the use of drones is becoming so appealing that even television news networks may have their own drone fleets in the near future.
The report, produced by the DSB, which describes itself as “a Federal Advisory Committee established to provide independent advice to the Secretary of Defense,” is dated July 19, 2012 and is unclassified. The DSB Task Force on the Role of Autonomy in DoD Systems actually finished gathering information for the report in October 2011.
“It should be made clear that all autonomous systems are supervised by human operators at some level,” the report states. However, a problem highlighted by the report, and one which has also been raised by the Air Force, is the sheer amount of data constantly pouring in from unmanned platforms around the globe and the relative lack of analysts to process it.
This problem seems to be at least partially behind the push towards increased autonomy, since autonomous systems “can enable humans to delegate those tasks that are more effectively done by computer… thus freeing humans to focus on more complex decision making,” according to the report.
“However, the true value of these systems is not to provide a direct human replacement, but rather to extend and complement human capability by providing potentially unlimited persistent capabilities, reducing human exposure to life threatening tasks, and with proper design, reducing the high cognitive load currently placed on operators/supervisors,” continues the report.
Yet this seems to be raising even more problems, including a so-called “brittleness” which can result in irreversible errors.
“Current designs of autonomous systems, and current design methods for increasing autonomy, can create brittle platforms,” the report states, adding that there are “new failure paths associated with more autonomous platforms, which has been seen in friendly fire fatalities. [...] This brittleness, which is resident in many current designs, has severely retarded the potential benefits that could be obtained by using advances in autonomy.”
One must wonder: why the push towards increasingly aggressive use of autonomy when we know that “new failure paths” have been created and have already “been seen in friendly fire fatalities”?
According to the report, other nations are heavily engaged in the research and use of autonomous unmanned systems, including China, which has been investing “alarming” amounts of money in the field.
The authors of the report also encourage an increased focus on the vulnerabilities of unmanned systems, something which has been sidelined and minimized quite a bit, even after our own drone fleet was infected by a virus.
Aftergood points out that the report “includes some intriguing citations,” including a work entitled “Governing Lethal Behavior in Autonomous Robots” (note: the work is hosted on Google Books).
Other observations in the report include some staggering numbers about the sheer amount of data being sent to the National Geospatial Intelligence Agency (NGA), although that is far from the only government entity suffering from the problem, as indicated above.
“Big data has evolved as a major problem at the National Geospatial Intelligence Agency (NGA),” states the report. “Over 25 million minutes of full motion video are stored at NGA.”
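To put that figure in perspective, here is a rough back-of-envelope Python sketch of what 25 million minutes of full-motion video might occupy in raw storage. The 25 million minutes number comes from the report; the bitrates are illustrative assumptions on my part, not DoD specifications:

```python
# Rough storage estimate for the NGA's archived full-motion video.
# The 25 million minutes figure is from the DSB report; the bitrates
# below are illustrative assumptions, not official specifications.

MINUTES_STORED = 25_000_000
SECONDS_PER_MINUTE = 60
BITS_PER_PETABYTE = 8 * 1000**5  # decimal petabytes

# Assumed average encoded bitrates, in megabits per second (hypothetical).
assumed_bitrates_mbps = {
    "standard-definition video (~2 Mbps)": 2,
    "high-definition video (~8 Mbps)": 8,
}

for label, mbps in assumed_bitrates_mbps.items():
    total_bits = MINUTES_STORED * SECONDS_PER_MINUTE * mbps * 1_000_000
    print(f"{label}: ~{total_bits / BITS_PER_PETABYTE:.2f} PB")

# Prints roughly 0.38 PB at the assumed SD rate and 1.50 PB at the
# assumed HD rate, before any new wide-area sensors come online.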
The new sensors being developed, including a camera capable of capturing a stunning 36 square miles in a single blink, are going to make this problem even worse.
Furthermore, the military has not stopped devoting funds to the research and development of new sensors. The DoD announced a new award on August 3, 2012, of a whopping $23.93 million to Raytheon for “an airborne, electro-optic, forward-looking infra-red, turreted sensor package that provides long-range surveillance, high altitude target acquisition, tracking, range-finding, and laser designation, and for all tri-service and NATO laser guided munitions.”
With the United States being broke, some might be surprised that sums this large continue to be devoted to drone cameras. However, with Raytheon being among the American war profiteers, I would be shocked if this kind of money wasn’t constantly being funneled into their coffers.
The amount of data which will soon be inundating military analysts is so large that any hope of actually combing through it all is likely misplaced.
“Today nineteen analysts are required per UAV orbit [i.e. per 24 hour operational cycle]. With the advent of Gorgon Stare, ARGUS, and other Broad Area Sensors, up to 2,000 analysts will be required per orbit,” states the report.
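The arithmetic behind that claim is worth spelling out. A minimal sketch using the report’s per-orbit figures, where the number of simultaneous orbits is my own hypothetical example and not an official count:

```python
# Scaling of the analyst requirement quoted from the DSB report:
# 19 analysts per UAV orbit today versus up to 2,000 per orbit once
# wide-area sensors such as Gorgon Stare and ARGUS are fielded.
# The orbit count is a hypothetical example, not an official figure.

ANALYSTS_PER_ORBIT_TODAY = 19
ANALYSTS_PER_ORBIT_WIDE_AREA = 2_000
example_orbits = 50  # hypothetical simultaneous 24-hour orbits

today = example_orbits * ANALYSTS_PER_ORBIT_TODAY
wide_area = example_orbits * ANALYSTS_PER_ORBIT_WIDE_AREA

print(f"Analysts needed today:          {today:,}")      # 950
print(f"Analysts needed with wide-area: {wide_area:,}")  # 100,000
print(f"Growth per orbit: ~{ANALYSTS_PER_ORBIT_WIDE_AREA / ANALYSTS_PER_ORBIT_TODAY:.0f}x")  # ~105x
```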
The report highlights just how problematic this is, writing that the government simply “can’t hire enough analysts or buy enough equipment to close these gaps.”
This problem very well might be ameliorated, or at least appear to be, by the increased use of autonomous systems. However, one must question whether the benefits outweigh the risks, since such technology could very well lead to the deaths of innocent people.
There are already enough civilians being killed by drones with humans operating them; are we really going to trust automated systems when it comes to operating these high-flying weapons platforms? Personally, I side with the applied ethicist who recently wrote a paper encouraging engineers to outright refuse to work on any military robots. Unfortunately, it doesn’t look like many are following his advice just yet.
Please support our work and help us start to pay contributors by doing your shopping through our Amazon link or check out some must-have products at our store.
This article first appeared at End the Lie.
Madison Ruppert is the Editor and Owner-Operator of the alternative news and analysis database End The Lie and has no affiliation with any NGO, political party, economic school, or other organization/cause. He is available for podcast and radio interviews. Madison also has his own radio show on Orion Talk Radio from 8 pm to 10 pm Pacific, which you can find HERE. If you have questions, comments, or corrections, feel free to contact him at admin@EndtheLie.com.