Abstract
In this paper I argue in favour of the adoption of an interdisciplinary approach based on computational methods for the development of language policies. As a consequence of large-scale phenomena such as globalization, economic and political integration, and progress in information and communication technologies, social systems have become increasingly interconnected. Language-related systems are no exception. Moreover, language matters are never just language matters: their causes and consequences are to be found in many seemingly unrelated fields. Therefore, we can no longer overlook the numerous variables involved in the unfolding of linguistic and sociolinguistic phenomena if we wish to develop effective language policy measures. A genuinely interdisciplinary approach is key to addressing language matters (as well as many other public policy matters). In this regard, the tools of complexity theory, such as computational methods based on computer simulations, have proved useful in other fields of public policy.
Highlights
As a consequence of large-scale phenomena such as globalization, social integration, migrations, and progress in information and communication technology, the world has become a much more complex place than it used to be
In this paper I provide a brief review of complexity theory, an approach developed for the study of complex phenomena, and relate it to language policy
I show that computational methods can play an important role in the field of language policy
Summary
As a consequence of large-scale phenomena such as globalization, social integration, migrations, and progress in information and communication technology, the world has become a much more complex place than it used to be. Policy interventions in a complex environment are often effective only if they are designed with a degree of complexity that matches that of the issue they address (Bar-Yam, 2015), lest societies collapse under a level of complexity that is no longer sustainable (Tainter, 1988). In this paper I provide a brief review of complexity theory, an approach developed for the study of complex phenomena, and relate it to language policy. Complexity theory stands in direct opposition to the philosophical position of "reductionism", which holds that all processes and phenomena can be reduced to a collection of simpler basic parts. This does not amount to saying that complexity theory rules out the possibility of deducing larger macro-dynamics from individual micro-cases; quite the opposite. Finally, I discuss the role that computational methods can play in language policy making.
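To make the idea of "computational methods based on computer simulations" concrete, the sketch below implements a minimal agent-based version of the Abrams–Strogatz language competition model (Abrams & Strogatz, 2003), a standard example from the complexity literature rather than a method proposed in this paper. Each agent speaks one of two languages and may switch with a probability that depends on the other language's current share of speakers and its social status; the macro-level outcome (language shift or coexistence) emerges from these micro-level rules. All names and parameter values are illustrative assumptions.

```python
import random

def simulate_language_shift(n_agents=1000, s=0.6, a=1.3, steps=200, seed=42):
    """Minimal agent-based Abrams-Strogatz-style simulation (illustrative only).

    Each agent speaks language X or Y. At every step, an agent may switch
    to the other language with a probability proportional to that language's
    current share of speakers (raised to the power a) times its status:
    s for language X, (1 - s) for language Y.
    """
    rng = random.Random(seed)
    # Arbitrary starting point: 40% X speakers, 60% Y speakers.
    n_x = n_agents * 2 // 5
    agents = ['X'] * n_x + ['Y'] * (n_agents - n_x)
    history = []
    for _ in range(steps):
        # The macro state (share of X speakers) is frozen for this step.
        x_share = agents.count('X') / n_agents
        history.append(x_share)
        for i, lang in enumerate(agents):
            if lang == 'Y':
                # Probability of switching Y -> X grows with X's share and status s.
                p_switch = s * x_share ** a
            else:
                # Probability of switching X -> Y mirrors it with status 1 - s.
                p_switch = (1 - s) * (1 - x_share) ** a
            if rng.random() < p_switch:
                agents[i] = 'X' if lang == 'Y' else 'Y'
    history.append(agents.count('X') / n_agents)  # final macro state
    return history

if __name__ == "__main__":
    shares = simulate_language_shift()
    print(f"Initial share of X speakers: {shares[0]:.2f}")
    print(f"Final share of X speakers:   {shares[-1]:.2f}")
```

Running the sketch reproduces the model's typical outcome: the higher-status language eventually absorbs the whole population. Varying the status parameter s or the initial shares lets an analyst explore, in silico, which conditions might sustain coexistence; this is exactly the kind of macro-from-micro reasoning the summary describes.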