After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at the border to systems for verifying documents and transcribing interviews, a wide range of technologies is being deployed in asylum applications. This article explores how these systems have reshaped the way asylum procedures are conducted. It shows how asylum seekers are turned into obligated yet hindered techno-users: they are required to follow a series of techno-bureaucratic steps and to keep up with unpredictable, minute changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their right to protection.
It also shows how these technologies are embedded in refugee governance: they feed into the ‘circuits of financial-humanitarianism’ that operate through a whirlwind of dispersed technological requirements. These requirements deepen asylum seekers’ socio-legal precarity by hindering them from accessing the channels of protection. The article further argues that analyses of securitization and victimization should be complemented by attention to the disciplinary mechanisms of these technologies, through which migrants are turned into data-generating subjects who are disciplined by their reliance on technology.
Drawing on Foucault’s notion of power/knowledge and of local knowledge, the article argues that these technologies have an inherent obstructiveness. They have a double effect: while they help to expedite the asylum process, they also make it harder for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to erroneous decisions made by non-governmental actors, and to ill-informed and unreliable narratives about their situations. Moreover, the technologies pose new risks of ‘machine mistakes’ that may lead to erroneous or discriminatory outcomes.