Wednesday, July 29, 2015
I am lucky to have been able to teach and become friends with transgender people serving in the Australian Defence Force. So when I visited the US Military Academy at West Point recently, I was saddened to see how the policy of not allowing transgender people to serve openly in the military affected the lives (and mental health) of cadets at West Point. To be honest, it brought tears to my eyes, knowing that they were willing to serve despite the fact that the military could dishonorably discharge them at any time, simply for their gender identity. It was therefore a wonderful moment when a friend excitedly sent me an article about a possible change in policy regarding transgender people serving in the US military.
U.S. Defense Secretary Carter announced in July that there would be a Defense Department review into the feasibility of transgender people serving openly in the US military. This announcement followed on from Defense Secretary Hagel's public comments in 2014 regarding the possibility of transgender members serving. The review will take six months and will look at the practicalities around currently serving transgender soldiers, as well as the issues that may arise around the transition process for service members who are transitioning gender.
During my time at West Point I felt very honored to be invited to attend the Knights Out Dinner organised by the West Point gay and lesbian alumni association, where several transgender soldiers were honored for their service and for their moral courage in the face of extensive discrimination within the military. Much of the discussion at that dinner was around the improvement of the situation of gay and lesbian soldiers, and the huge hurdles still faced by transgender service members. I was therefore pleasantly surprised to read of the DOD review into transgender people serving in the US military, as I had assumed it would take much longer for such a process to happen.
For many decades gay and lesbian soldiers served in the U.S. military without disclosing their sexual orientation, for fear of being dishonorably discharged. In 1993 President Bill Clinton introduced the "Don't Ask, Don't Tell" policy, which made it possible for gay and lesbian soldiers to serve, although not openly. The restriction on serving openly in the US military as a gay or lesbian person ended on the 20th of September, 2011. Many of the service personnel who were discharged for being gay or lesbian were given less than honorable discharges (sometimes referred to as dishonorable discharges). These less than honorable discharges had a lifelong impact on veterans, restricting their access to benefits such as veterans' health care and the GI Bill, as well as making it difficult for them to find employment. The Department of Defense has enabled the 114,000 service members who were discharged solely for their sexuality to apply to upgrade their discharge status to honorable, through the review board process for each service.
Cate McGregor - www.smh.com.au
In contrast to the situation in the United States, transgender soldiers have been able to serve openly in the Australian Defence Force since 2010, and gay and lesbian soldiers have been able to serve openly since 1992 (which, interestingly, was about the time that the Don't Ask, Don't Tell policy began). Group Captain Cate McGregor is Australia's highest-ranking transgender military member, having served in many different roles, including as speechwriter and strategic advisor to the Chief of Army, Lieutenant General David Morrison (who is famous internationally for his "get out" video on YouTube regarding inclusion in the Australian military).
I look forward to watching the US military move forward to join the Australian military in allowing openly transgender members to serve their country.
Posted by Nikki at 4:20 PM
Tuesday, July 28, 2015
The Future of Life Institute has published an Open Letter calling for the banning of Autonomous Weapons Systems.
Here is the text of that open letter....
Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.
Many arguments have been made for and against autonomous weapons, for example that replacing human soldiers by machines is good by reducing casualties for the owner but bad by thereby lowering the threshold for going to battle. The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.
Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits. Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists supported the treaties banning space-based nuclear weapons and blinding laser weapons.
In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.
You can view the list of signatories here http://futureoflife.org/AI/open_letter_autonomous_weapons (including Stephen Hawking, Skype founder Jaan Tallinn and Apple Co-Founder Steve Wozniak).
Several other organisations are also calling for the banning of autonomous weapons systems, such as the Campaign to Stop Killer Robots, the International Committee for Robot Arms Control, and Article 36.
Whilst I share a lot of their concerns, I suspect that at this point lethal autonomous weapons systems are inevitable; the debate over whether they should be developed should have happened at this level a long time ago, not on the eve of their deployment into the field. The discussion around LAWS, whilst vital for the future of warfare, also seems to ignore the elephant in the room - the use of unmanned aerial vehicles (also sometimes called drones), mainly by the USA, in a wide variety of lethal situations outside the normal theatre of war (usually in the name of the war on terror), which can only be described as assassinations or targeted killings. Whilst the automation of weapons systems does raise unique issues, it seems that we need to get right the issues regarding unmanned but not fully autonomous weapons systems first. The people of Pakistan do not care whether the drones flying overhead, terrorising their children, are manned or operating autonomously - the effect for them is the same.
In order to more fully understand the issues raised by these emerging technologies, it is worth looking at the work of respected ethicist Pat Lin, who was invited to speak at the UN deliberations on LAWS at the five-day meeting on the Convention on Certain Conventional Weapons in Geneva in April 2015. A copy of Pat's presentation, "The Right to Life and the Martens Clause", is available online to read, as are the presentations of the other speakers at the meeting. The article "Do Killer Robots Violate Human Rights?" that Pat wrote for The Atlantic about these discussions, and the issues that they raised, is very interesting reading.
Posted by Nikki at 12:10 PM