Intelligent machines and artificial intelligence in the form of software are now so widespread that the delegation of responsibility and accountability to non-human actors that these technologies bring demands serious consideration. What this delegation means for the risk assessment of such technologies, and what it would mean for regulatory approaches, is a central concern of this thesis. From warfare1 to our financial systems2 down to everyday tasks3 and social responsibilities4, machine autonomy is having …
As a society, we already rely on a large amount of delegation to non-human actors such as intelligent machines and software. It is therefore important to understand the technological capabilities and opportunities, as well as the potential risks and conflicts of interest, that certain imaginations and practices of machine autonomy can bring with them. The aim of this thesis is to present an overview of the current debate around machine autonomy and to make a case for the regulation of machine autonomy at the international level. The field of Science, Technology and Society (STS) provides a useful framework for analysing the underlying mechanisms of agency and the move towards non-human agency, the different …
Tasks can range from mundane or dangerous work such as disaster relief and clean-up, to more demanding tasks such as security and patrolling, up to very complex undertakings such as full-scale warfare. The delegation of accountability away from human and towards non-human actors is another important aspect of this thesis. The question of whether a regulation of machine autonomy is feasible will serve to illustrate some of the issues that such regulation would raise. A non-human actor can be defined as any non-human entity capable of taking action and making decisions; this includes software, algorithms, objects and machines. When the basic idea of the sociotechnical imaginary, the question "how do we want to live in the future?", is applied to the topic of this thesis, questions about accountability, risk and responsibility become even more apparent. Over the centuries our societies have developed elaborate rules and laws concerning the delegation of responsibility, and therefore the acceptance of accountability, to other human actors or human institutions. The risks of delegating tasks to other human beings are understood and accepted; regulation and lawmaking provide the framework that allows for the transfer of responsibility and accountability without risking one's own legal security. If, as some imaginations of machine autonomy suggest, technological