Published on 29.08.2024 in Vol 3 (2024)

Preprints (earlier versions) of this paper are available at https://preprints.jmir.org/preprint/56665.
Ethics and Governance of Neurotechnology in Africa: Lessons From AI


Authors of this article:

Damian Eke1

Viewpoint

School of Computer Science, University of Nottingham, Nottingham, United Kingdom

Corresponding Author:

Damian Eke

School of Computer Science

University of Nottingham

Wollaton Rd, Lenton, Nottingham

Nottingham, NG8 1BB

United Kingdom

Email: damian.eke@nottingham.ac.uk


As a novel technology frontier, neurotechnology is revolutionizing our perceptions of the brain and nervous system. With growing private and public investments, a thriving ecosystem of direct-to-consumer neurotechnologies has also emerged. These technologies are increasingly being introduced in many parts of the world, including Africa. However, as the use of this technology expands, scholars of neuroethics and the ethics of emerging technology are drawing attention to the critical concerns it raises. These concerns are largely not new but are uniquely amplified by the novelty of the technology. They include ethical and legal issues such as privacy, human rights, human identity, bias, autonomy, and safety, which are also part of the artificial intelligence ethics discourse. Most importantly, there is an obvious lack of regulatory oversight and a dearth of literature on the consideration of contextual ethical principles in the design and application of neurotechnology in Africa. This paper highlights lessons African stakeholders need to learn from the ethics and governance of artificial intelligence to ensure the design of ethically responsible and socially acceptable neurotechnology in and for Africa.

JMIR Neurotech 2024;3:e56665

doi:10.2196/56665


The increasing convergence of neuroscience, engineering, materials science, and emerging technologies, such as artificial intelligence (AI), robotics, extended reality, and so on, has given rise to a novel technology frontier called neurotechnology. Once dismissed as the stuff of fiction, cutting-edge invasive and noninvasive neurotechnology is now becoming the reality of our time. Significantly intertwined with advancements in AI, machine learning, and deep learning, neurotechnology holds tremendous promise for research and practice. From brain imaging devices that have transformed our understanding of brain structures and functions to neuromodulation and neurostimulation devices that improve the quality of life of people with brain disorders, neurotechnology is revolutionizing our perceptions of the brain and nervous system. It is also becoming a booming industry with growing private and public investments in direct-to-consumer neurotechnologies [1]. This is owing to a number of factors, including the increasing prevalence of brain diseases and the increasing integration of innovation into biomedical research and practice.

In the last decade, many countries and regions have recognized the need for sustained public investment in brain research as a priority. Publicly funded large-scale brain research projects include the US BRAIN Initiative (US $3 billion), the EU Human Brain Project (€607 million), the China Brain Project (US $746 million), Japan Brain/MINDS (approximately US $365 million), the Australian Brain Alliance (US $500 million), the Canadian Brain Research Strategy (US $250.3 million), and the Korea Brain Initiative (US $1.2 billion) [2]. These projects and the emerging landscape of public and private funding opportunities have created a global ecosystem in which countries in Africa and other developing countries in the Global South are being left behind in brain research and innovation. Public funding for neuroscience research and innovation is almost nonexistent in Africa [3]. Neurotechnology research and innovation are concentrated in select countries in North America, Europe, and Asia, yet the resulting devices are currently being introduced in African contexts and to African consumers. However, as has been observed with other emerging technologies, such as AI, technology is developed with a specific context in mind and reflects the dynamics of that context. The use of a technology beyond the context it was created for raises concerns, including bias and discrimination. This means that the use of neurotechnology devices in Africa holds potential ethical and legal risks for both individuals and society.

As this technology expands, scholars of neuroethics and the ethics of emerging technology are drawing attention to the critical concerns it raises [4-6]. These concerns are generally not new but are uniquely amplified by the novelty of the technology. They include issues around privacy, rights, human identity, autonomy, and safety. Many of these are already part of the ethics of emerging technology discourse, particularly AI ethics. The unique risks this technology raises have led to calls for adequate regulatory oversight. Global discussions in this regard have gained momentum in the last decade, with a focus on ensuring responsible and equitable design and deployment of this disruptive technology. Intergovernmental bodies, such as the Organization for Economic Cooperation and Development (OECD), UNESCO (United Nations Educational, Scientific and Cultural Organization), and the Council of Europe, are playing major roles in this regard. Chile has become the first country to implement legal measures that address the risks of neurotechnologies [7]. The Chilean constitution was recently amended to legally protect citizens' mental privacy and free will.

However, as more and more countries discuss the ethics and governance of neurotechnology, there is an obvious lack of regulatory oversight and a dearth of literature on the consideration of contextual ethical principles in the design and application of neurotechnology in Africa. This scenario parallels the historical development of AI in Africa. To avoid repeating past errors, this paper delineates these pitfalls through an in-depth examination of the lessons to be learned. It highlights insights that the ethics and governance of neurotechnology in Africa can derive from the extant literature on responsible AI. Stakeholders, such as neurotechnology developers, policy makers, and academic researchers, stand to gain valuable perspectives from these insights. The resultant impact is anticipated to contribute significantly to the responsible design and implementation of neurotechnology devices in Africa, including the establishment of robust policies and regulations and the provision of guidance for academic discourse within this domain. This paper starts by providing a conceptual clarification of neurotechnology. It then discusses why Africa should care about neurotechnology and what responsible neurotechnology will mean for Africa. Lessons are then drawn from the available literature. It is important to note that African societies are not considered monolithic; rather, Africa is taken as a continent with common sociocultural values and challenges relevant to the ethics and governance of neurotechnology.


The human brain remains the most complex organ in the human body, largely due to its intricate structure and its central role in coordinating all functions and activities of the body. Many factors contribute to this exceptional complexity, including its adaptability and plasticity, cognitive abilities, neural networks, sensory processing capabilities, motor control, homeostasis, consciousness, genetic complexity, metabolic demand, multilayered structure, and infinite variability. According to Jorgenson et al [8], the goal of comprehensively understanding all these factors “remains elusive, although not from a lack of collective drive or intellectual curiosity on the part of researchers. Rather, progress frequently has been limited by the technologies available during any given era.” Neurotechnology has emerged to provide a greater understanding of the brain and offer solutions to previously understudied brain disorders.

UNESCO defines neurotechnology as a “field of devices and procedures used to access, monitor, investigate, assess, manipulate, or emulate the structure and function of the neural systems of animals or human beings” [9]. It involves the application of engineering principles to the “understanding, engagement, and repair of the human nervous system” [10]. Neurotechnology refers to a set of technologies, rather than a specific technology, that enables a direct connection or interface between technical components and the nervous system [4,11]. This interaction with the brain and nervous system raises a variety of ethical and legal risks, necessitating the discussion of the ethics of neurotechnology.

The importance of ethics in neurotechnology is demonstrated in UNESCO’s call for solid governance of neurotechnology design and deployment at the last international conference on the ethics of neurotechnology on the theme “Towards an Ethical Framework in the Protection and Promotion of Human Rights and Fundamental Freedoms” [12]. From noninvasive technologies used to study the brain to wearable or implantable devices, neurotechnology is opening new possibilities to study the nervous system and help diagnose, treat, and prevent brain-related diseases [1,13]. These technologies are currently being developed for diverse uses in clinical and research settings, as well as for everyday life, workplace well-being, and education. In research, the development of neurosensing technologies, such as magnetic resonance imaging (MRI), functional MRI, electroencephalography, magnetoencephalography, positron emission tomography, functional near-infrared spectroscopy, single-photon emission computed tomography, biomarker analysis tools, and invasive intracranial electrodes, has been transformative for studying the brain. These technologies are fundamental for advancing the understanding of the brain because they provide valuable insights into brain function, neurophysiology, and neurological disorders.

Beyond neurosensing, neurotechnology is also being developed for other purposes, including neurostimulation, neuroprostheses, and neurorehabilitation. Neurostimulation devices have shown potential in trials to provide therapeutic relief for a number of brain-related disorders, such as epilepsy [14], chronic pain [15], Parkinson disease [16], obsessive-compulsive disorder [17], depression [18], addiction disorders [19], spinal cord injuries [20], and Tourette syndrome [21]. Neurotechnology encompasses invasive and noninvasive devices that serve as motor [22] or sensory [23] neuroprostheses. These devices work by connecting to or bypassing damaged neural pathways to restore function or enhance communication in people with stroke [24], spinal cord injuries [25], amyotrophic lateral sclerosis [26], cerebral palsy [27], and traumatic brain injuries [28]. When these technologies are designed to help individuals regain lost functional abilities, improve their quality of life, and promote independence, they are referred to as neurorehabilitation technologies [29]. These range from noninvasive brain-machine interfaces for patients with stroke [30] to brain-actuated robotic devices designed to restore the independence of people experiencing severe motor disabilities [31].

Indeed, there is an emerging and thriving neurotechnology industry, one that reached a market value of US $11.3 billion in 2021 and is projected to exceed US $24.2 billion in 2027, with an estimated annual growth rate of 14.4% [32]. According to a recent UNESCO report, “the United States emerges as the main place where neurotechnology-related innovations are generated (47% of worldwide IP5 patent applications in neurotechnology), followed by Korea (11%), China (10%), Japan (7%), Germany (7%), and France (5%)” [1]. The report further reveals that, of 1400 identified neurotechnology companies, 50% are based in the United States, 35% in Europe and the United Kingdom, and the rest in Asia [1]. These figures are not surprising, given the level of public funding that brain research and innovation receive in the Global North. While investments in neurosensing, neurostimulation, neuroprostheses, and neurorehabilitation differ in size, volume, and level of maturity (neurosensing being the most mature and attracting the highest investments), there are exponential increases in all aspects of neurotechnology.


Globally, the burden of neurological disorders has increased significantly over the past 2 decades, with substantial increases in mortality (36.7%) and disability (7.4%) rates [33]. There is also evidence in the literature of large geographical variation in the burden of these disorders [33-37], and consistent evidence of an increasing prevalence of neurological diseases in Africa, placing a huge burden on public health systems [38-40]. Some of these brain-related diseases include stroke, dementia, Parkinson disease, epilepsy, migraine, medication overuse headache, motor neuron disease, cerebral palsy, brain development disorders, peripheral neuropathy, trauma, alcohol-related brain damage, nervous system complications of HIV/AIDS, brain and nervous system cancers, and multiple sclerosis. They also include psychiatric disorders and mental health conditions, such as depression, anxiety, schizophrenia, psychosis, bipolar disorder, stress, and other behavioral disorders [41,42]. Many of these are preventable (eg, some developmental disorders and strokes), and others are possibly treatable with novel technologies and therapies.

Silberberg and Katabira [43] believe that the increasing prevalence and burden are the result of factors such as “adverse perinatal conditions, malaria, HIV/AIDS and other causes of encephalitis and meningitis, demographic transitions, increased vehicular traffic, and persistent regional conflicts.” In addition, other factors increase the impact of neurological disorders in Africa, such as sociocultural and religious beliefs, stigma and discrimination, lack of quality therapies or treatment, and the absence of an organized public sector response [40]. Neurological disorders are often neglected or ignored altogether in most African societies [44]. Patients in Africa face challenges related to a lack of health care infrastructure and access to specialized services, and many strongly believe that there is no available treatment or therapy for neurological disorders.

Neurotechnology provides hope to African societies struggling with the burden of neurological disorders. It can help bridge the gap in access to neurological care in many underserved communities and support early diagnosis through advanced neuroimaging and diagnostic tools (eg, portable and low-cost electroencephalography machines). Neurorehabilitation tools, such as virtual reality–based therapies [45,46], functional electrical stimulation [47], and telerehabilitation platforms [48], can give patients access to rehabilitation services in areas that lack resources for therapy. A number of neurostimulation devices may provide effective treatment options for patients with epilepsy, while remote computer-based therapies offer possible relief for a number of brain diseases. There are potentially significant opportunities for neurotechnology to have a positive impact on neurological diagnosis, treatment, and rehabilitation in Africa. However, challenges related to cost and infrastructure remain and will need to be overcome.

In addition to clinical support, neurotechnology can also strengthen research in Africa to better understand the epidemiology of neurological disorders and the factors contributing to their prevalence on the continent, as well as improve global knowledge of the human brain and nervous system. Neuroimaging data generated and processed in Africa can contribute immensely to the global understanding of the brain and its diseases, given the genetic diversity of the continent. This implies that brain diseases plaguing African populations will be better understood, raising the likelihood of developing suitable therapies for them. Neurotechnology can also be applied in marketing and consumer research in Africa, especially in the emerging field of neuromarketing. From product testing, pricing, and value perception to emotion analysis and branding, neurotechnology can help companies understand and influence consumer behavior, preferences, and decision-making. Although this may seem far-fetched, it can help African businesses become more competitive in the global market.

Despite the abovementioned potential benefits of neurotechnology, there are also many good reasons for Africa to prioritize other goals and issues, such as clean water and sanitation, food security, housing and shelter, education, and access to basic health care. This is a valid argument, and governments need to focus more on these basic needs. However, it is important to note that neurotechnology is pervasive, becoming increasingly widespread and influential across various aspects of society. While most of these technologies are being developed outside of Africa, they will be used in Africa. In today’s interconnected world, neurotechnologies can spread rapidly across borders through various channels, such as trade, investment, collaboration, and intellectual property agreements. Therefore, the pervasive and ubiquitous nature of this technology makes it hard to neglect.

Furthermore, as UNESCO [12] has observed, the application of neurotechnology triggers a number of critical ethical and legal considerations, including, but not limited to, autonomy, privacy, mental integrity, human dignity, personal identity, and freedom of thought. Given the African sociocultural context and the possibility of using this technology for enhancement, it also raises fundamental questions related to personhood, with profound implications for individuals and societies at large. There are also issues around benefit sharing, the digital divide, accessibility, and safety. These concerns are amplified by the fact that current neurotechnological systems are being developed with limited data from Africa and without consideration of African sociocultural values and principles. For instance, a brain-computer interface that decodes continuous language from noninvasive recordings would have many useful scientific and practical applications [49], but it raises fundamental questions about the privacy of brain data. In a study that relies in part on an AI transformer model, Tang et al [49] claim to have used functional MRI to produce texts of participants’ imagined thoughts. The implications this has for privacy are novel and immensely significant in the face of an evident lack of governance mechanisms to ensure that these technologies are designed in an ethically responsible, socially acceptable, and legally compliant way. But what should responsible neurotechnology for Africa look like?


Like other emerging technologies, neurotechnology raises crucial ethical and legal challenges. However, there remains a dearth of policy frameworks and regulations to ensure the development of responsible and trustworthy neurotechnology. During the UNESCO conference on the ethics of neurotechnology in Paris on July 13, 2023, participants agreed on the need for a comprehensive governance framework to harness the potential of neurotechnology and address the evident risks it presents to societies. Speaking at the conference, the Assistant Director-General for Social and Human Sciences of UNESCO declared that “…we must act now to ensure it is not misused and does not threaten our societies and democracies” [12]. In the absence of national, regional, and international principles, policies, and governance mechanisms for neurotechnology, the OECD adopted a set of recommendations on responsible innovation in neurotechnology in 2019 [50]. This is the first attempt to set an international standard that aims to guide government agencies as well as innovators to address the ethical, legal, and social challenges that neurotechnology raises. The principles encompassed in the OECD recommendation are promoting responsible innovation; prioritizing safety assessment; promoting inclusivity; fostering scientific collaboration; enabling societal deliberation; enabling the capacity of oversight and advisory bodies; safeguarding personal brain data and other information; promoting cultures of stewardship and trust across the public and private sectors; and anticipating and monitoring potential unintended use or misuse. While this instrument does not constitute an international treaty, it covers critical challenges and opportunities for better innovation practices through responsibility-by-design approaches. Such a governance approach is needed to protect and promote human rights and fundamental freedoms. It is an approach that requires the integration of relevant values and principles that reflect the contexts within which the technology will be applied.

So far, the discussion on the ethics and governance of neurotechnology has neglected narratives, values, principles, and contexts in Africa. African datasets that can inform the design of neurotechnological systems are currently missing from available open-access platforms. Potentially, therefore, neurotechnology devices are being designed without relevant data from Africa, and the field of neuroscience and neurotechnology remains largely dominated by countries in the Global North. The question, then, is whether responsible neurotechnology can be achieved in Africa without African values, principles, data, and experts. Neurotechnology cannot be designed and developed in and for Africa without Africans, their data, and consideration of African sociocultural contexts, needs, expectations, values, and principles. The current debate on the ethics and governance of neurotechnology has taken a turn similar to that of AI ethics. Therefore, as UNESCO moves to develop a global normative instrument and ethical framework similar to its Recommendation on the Ethics of AI, it is important for policy makers, innovators, and civil society groups to consider the following lessons from AI ethics.


Neurotechnology and AI share some similarities and differences. It is common knowledge that artificial neural networks draw inspiration from the brain structure and function because they are designed with interconnected nodes that loosely mimic how brain neurons interact [51]. Both AI and neurotechnology also involve some forms of learning and adaptation. Similarly, they both have uses in health care. However, there are differences in implementation, complexity, and function. For instance, AI uses chips and programmed algorithms, and is based on mathematical models, while neurotechnology often interacts directly with biological systems (brains and nervous systems).

Brains use biological neurons with complex chemical interactions, while AI uses silicon chips and programmed algorithms. The implication of these differences and similarities is that AI and neurotechnology share common aims and challenges but also demand distinct approaches, methodologies, and considerations. The convergence of the 2 can potentially lead to breakthroughs in understanding both biological and artificial intelligence, as well as the brain itself, ultimately providing benefits for society. However, attention must be paid to the risks they raise, and the discourse on the ethical considerations of AI is more advanced than that on neurotechnology.

In general, neurotechnology can learn valuable lessons from the evolving field of AI ethics and governance to ensure responsible design, development, and deployment. AI ethics debates highlight the need for transparency and explainability in disruptive technological systems to build trust and accountability. Fairness, equity, responsibility, justice, and autonomy are central to AI ethics. These are principles neurotechnology innovators and policy makers need to adopt to ensure that societal needs and contexts are prioritized.

In addition to these, there are also unique lessons for relevant stakeholders designing, developing, and using neurotechnology in and for Africa. These include innovators, neurotechnology industry players, users, and policy makers.

Epistemic Injustice

Neurotechnology, like AI, is a value-laden technology, but the critical question is, and should be, whose values and social contexts shape the design and development of the technology [52]? For decades, AI design, development, deployment, ethics, and governance were based on Euro-American epistemic foundations. Values and principles often discussed in the context of value-sensitive design largely reflected worldviews from the Global North, while narratives, values, and principles from the Global South were mostly forgotten or ignored [53,54]. Ruttkamp-Bloem [55] argues that Africa’s exclusion from global AI debates constitutes epistemic injustice that cuts across both hermeneutic and testimonial injustice. This includes the exclusion of African academics and AI practitioners and the work they do in Africa. This lack of diversity, especially at the design and development stage, leads to the exclusion of important knowledge and perspectives from underrepresented communities. Epistemic injustice in AI manifests in different forms, including increased bias in AI algorithms, exclusion and marginalization, reinforcement of stereotypes, and other unintended harms. In the context of neurotechnology, this can lead to unfair or inaccurate diagnoses or predictions. Responsible neurotechnology, particularly in Africa, ought to be based on the foundation of epistemic justice: a recognition of Africa’s unique contexts, sociocultural and ethical values, principles, and needs. While there might be a need to enhance capacities for neurotechnology design and implementation in Africa, the inclusion of African experts, data, values, contexts, and principles in the design and implementation of neurotechnology is critical to the idea of responsible innovation in neurotechnology.

Principles Alone Cannot Guarantee Responsible Neurotechnology

As awareness of the risks of AI has risen, private and public institutions have responded with a “deluge of AI codes of ethics, frameworks, and guidelines” outlining high-level principles and values to guide the ethical design, development, and implementation of AI [56,57]. Mittelstadt [56] argues that the growing ecosystem of AI ethics has mostly produced “vague, high-level principles and value statements which promise to be action-guiding, but in practice provide few specific recommendations and fail to address fundamental normative and political tensions embedded in key concepts.” The argument here is that AI policy statements and ethical principles have remained ineffective [58] and are largely ignored in many technology-based companies [59]. As Baker and Hanna [60] observed, big technology corporations’ “commitments to ethics are hollowed out by vagueness and legal hand-wringing—in practice, they’re often merely commitments to maintaining public image and mitigating future public relations disasters.” While the OECD’s principles of responsible innovation in neurotechnology are laudable, principles alone are not enough to ensure that innovators and users practically embed relevant values and principles into the design and implementation of neurotechnology. Responsible neurotechnology needs implementable governance mechanisms that are ethical, legal, and technical. Building on established approaches to translating principles into practice in the biomedical sciences, the ethics and governance of neurotechnology should be more robust and rigorous than what we have observed in AI ethics.

Possibilities of Ethics Dumping

The ethics and governance of these technologies help to anticipate potential risks, promote safe innovation and deployment, and prevent uses that violate core values or expose people to unacceptable risks. As AI governance has gained momentum in the Global North, Ruttkamp-Bloem [55] believes that Africa has become the ethical dumping ground of the main players on the AI technology scene because of weak regulations. Ethics dumping here refers to the practice of carrying out unethical or legally nonpermissible research activities in countries or regions with weak or nonexistent regulations or governance frameworks. In AI, there is emerging evidence of ethics dumping in the form of “health data colonialism,” in which AI researchers and developers from big technology companies collect data and build algorithms in developing countries to avoid the stricter regulations of their own countries [61]. Another example is the outsourcing of data labeling by OpenAI to Africa, in what has been called labor exploitation and “unethical outsourcing” [62].

These are possible scenarios that can happen with neurotechnology. As countries in the Global North continue to discuss possibilities of neurorights and governance of neurotechnology, there is a likelihood that neurotechnology companies will exploit the nonexistent regulatory framework in Africa, from unethical human testing to labor exploitation. Africa needs to be aware of this and become proactive in considering governance mechanisms to guide the design, development, and deployment of neurotechnology.

Diversity of Datasets Is of Critical Importance

The diversity of datasets in the design of emerging technologies is of paramount importance. Diverse datasets offer a more accurate representation of the real world and help to ensure that the technology is fair, robust, inclusive, effective, and sustainable. Fairness, equity, and generalizability in AI depend largely on how representative the data used to train the AI system are; nonrepresentative datasets infuse bias into the system. It is also common knowledge that without datasets from Africa, AI cannot work for Africa. The quality, quantity, and diversity of data from Africa play a crucial role in the development of accurate, reliable, and generalizable AI systems in Africa. The same is true of neurotechnology. Racially exclusionary practices have been attributed to neurotechnological devices used for neuroimaging, both in data acquisition and analyses [63-65]. Many electroencephalography devices are simply not designed with Black hair in mind, which creates implicit racial bias; these devices were designed with insufficient data from Black people. It is important for neurotechnology innovators to focus on using sufficient and relevant data from Africa in their designs. Africa needs to focus more on generating and owning Findability, Accessibility, Interoperability, and Reusability (FAIR) datasets that can contribute to the design and development of neurotechnology in and for African societies [66].

Inadequate Regulatory Frameworks

One lesson from the recent AI boom in Africa is that many countries still lack a strong regulatory framework to address the challenges that emerging technologies like AI raise. While existing regulatory frameworks, such as data protection regulations, provide potential channels for integrating regulatory aspects of AI, the rapid advancement of this technology has outpaced the scope of these laws. The European Union’s AI Act has shown that a dedicated regulatory mechanism is required to accommodate the dynamics of such a disruptive technology. Neurotechnology has been described as a disruptive innovation that will disrupt existing practices as well as traditional boundaries between medical therapies and consumer markets [67]. It has the potential to cause profound social and legal disruption. Owing to their increased capabilities, aided by improved computational ability, machine learning, AI, and the availability of large-scale open-access databases, these technologies have the potential to become critical to future legal systems. There are possibilities of using them to predict the likelihood of recidivism, assess volition and intent, and detect lies, as well as potentially to reduce recidivism [68]. There are concerns related to mental privacy and surveillance (especially workplace mental surveillance), issues related to equity, and other aspects of personal liberty, which may not be fully captured by existing regulations. Unlike in the case of AI, Africa does not need to play catch-up. Relevant stakeholders, including policy makers and researchers, should be proactive in scrutinizing advances in neurotechnology. The time to act is now. There is a great need to develop an effective regulatory or governance framework that promotes responsible neurotechnology in a way that ensures safety, ethical, and legal concerns are sufficiently addressed.

Regulations are important here because this is a technology that challenges existing laws and belief systems. There have been claims in the literature that the risks neurotechnology poses to fundamental freedoms of thought and expression demand new regulations to protect cognitive liberties [69] or neurorights [70]. The risks are significantly exacerbated by the increasing application of neurotechnology in the military as well as in the consumer market for digital phenotyping, emotional information, neurogaming, and neuromarketing. These use cases highlight the possibility of exerting control over brain activities and individual thoughts, which raises risks of dual-use concerns, digital mental surveillance, misuse of neurodata, and other privacy issues. As AI has shown [71], this technology surpasses the ability of existing laws, including data protection laws, to govern its design and development. This paper certainly does not make a case for neurorights laws but seeks to highlight the need to establish regulatory frameworks, or amend existing ones, so that neurotechnologies align with African societal values and contexts.

Available Data Protection Regulations Are Inadequate

The global landscape of data protection regulation is expanding significantly, driven by increasing awareness of privacy issues and the need to regulate the growing digital ecosystem. Most importantly, the introduction of the European General Data Protection Regulation in 2018 is greatly influencing the global approach to data protection with its stringent requirements and extraterritorial scope. So far, over 30 African countries have established data protection laws or regulations. It is important to note, however, that while data protection regulations play a crucial role in safeguarding the privacy of individuals, the unique challenges posed by neurotechnology require additional measures and considerations [72,73]. Yuste [74] has raised awareness of the ability of implantable and nonimplantable neural devices to record and alter brain data in ways that jeopardize personal neuroprivacy. There is evidence in the literature to suggest that these devices can successfully decode mental imagery, emotions, story interpretation, and speech [49,75-77]. There are apparent voids in existing data protection regulations to address some of the complex issues involved here; they are not fully equipped to address the specific risks and implications associated with the collection and application of brain data by neurotechnology.

To address this gap, some have proposed the establishment of novel human rights, or neurorights [70,78], owing to concerns around mental privacy, mental integrity, and cognitive liberty. Others have proposed a data-centered approach focused on revising data protection regulations to incorporate issues raised by neurotechnology [79]. As the landscape of neurotechnology continues to progress in Africa, it is important for African policy makers to understand that the available regulations and laws on data are not adequately equipped to address the complex ethical and legal issues neurotechnology raises. While the option of novel human rights is being discussed globally, the data protection–centered approach may be the most pragmatic way to address the immediate, data-related risks involved in neurotechnology, given the claim for the exceptionalism of neurodata [80].

The Need for Stakeholder Engagement

At present, the debate over AI governance and regulation in Africa is being shaped by scientists, lawmakers, and scholars in the humanities and social sciences. However, such debates often lack representation from key stakeholders: the citizens and community members who use these technologies and who will be subject to any new rights. There is a growing consensus among scholars, national governments, and technology corporations about the need to recognize and involve the public as active participants in the design of AI governance [81]. This is often discussed as the democratization of AI or algorithms [82]. AI can have profound impacts on culture, society, and citizens’ rights. Public engagement ensures that these impacts are proactively considered and that the established governance mechanisms reflect the values and priorities of those who will use them. Similarly, the ethics and governance of neurotechnology in Africa will benefit greatly from public engagement, not only to raise awareness and understanding of the technology but also to inform the development of governance frameworks that are responsive to the needs and concerns of the public. Public engagement broadens the range of voices that can provide insights to better anticipate potential risks of the technology. It is important that such public engagement exercises are established to build public trust, which is critical to technology acceptance.

The Possibility of Corporate Capture

In the absence of functional governance frameworks for responsible AI, technology companies have taken the lead by funding most of the global AI ethics research. This provides an opportunity to influence the research agenda [83], which is what Gerdes [84] described as the tech industry’s hijacking of the AI ethics research agenda. Large parts of global AI ethics research are funded by big technology companies that are fundamentally more interested in their profits than in the public interest [85]. Gerdes [84] also identifies conflicts of interest in AI public policy–making initiatives. Leveraging weak or nonexistent funding mechanisms, regulations, and institutions in Africa, big neurotechnology companies could come to control research on the ethics and governance of neurotechnology in Africa. Neurotechnology industry players can capture the narrative or discussion on the ethics of neurotechnology to their own benefit, with grave consequences in real life. The ethics and governance of neurotechnology, particularly in Africa, need multistakeholder engagement and less performative effort from policy makers and innovators. They also need independent research efforts (free from big technology influence) that will not only inform governance but also build a sustainable human and technical infrastructure in Africa.

Dangers of Overly Anthropomorphizing Technology

Human-technology interactions have shown that there is always a tendency to anthropomorphize technology [86]. Indeed, anthropomorphism has become part of the AI literature [87-89]. It is the attribution or projection of humanlike characteristics to inanimate objects, animals, and, in this case, technology [90]: a cognitive bias [91] informed by sociocultural awareness and beliefs. Anthropomorphizing AI can lead to unrealistic expectations and overtrust in the technology. It can blur the lines between humans and machines and lead to the attribution of moral agency to machines. As devices that can interface between the brain and computers, certain neurotechnologies are developed to create more humanlike interactions. This raises the possibility of overly anthropomorphizing the technology, particularly in Africa, where anthropomorphism is already part of the cultural fabric through religion. This can lead to misconceptions about its capabilities and limitations, raising ethical and practical concerns. To mitigate the negative impacts of the anthropomorphism of neurotechnology, innovators and policy makers need to balance creating user-friendly neurointerfaces with maintaining transparency about the nature, capabilities, and limitations of the technology. This includes educating relevant stakeholders on the roles, nature, and constraints of neurotechnology.


This paper argues that neurotechnology is no longer a future technology; it is here and is now available not only for clinical research and practice but also to consumers. It is revolutionizing our understanding of the brain and its diseases, providing much-needed therapies for a wide range of patients, and is increasingly used in direct-to-consumer products. Some of these devices are being designed in and for Africa [92]. However, the rapid advancement of this technology raises serious risks concerning safety, privacy, human rights, the digital divide, bias, and discrimination. With weak or nonexistent ethical and regulatory institutions capable of ensuring responsible development and use in Africa, individuals and society at large face serious risks. Without putting Africans and their needs, interests, values, principles, contexts, data, and expectations into consideration in its design and governance, neurotechnology risks discriminating against Africans as well as jeopardizing the privacy and safety of citizens. This is similar to what is happening in the field of AI. Stakeholders, including policy makers, innovators, and users, can learn the above lessons from AI ethics and governance to ensure that proactive actions are instituted to mitigate the risks neurotechnology presents to Africans. These lessons need to be taken into consideration as public debates and governance mechanisms for neurotechnology are shaped in Africa. Proactivity and collaboration are key to responding to the demands of mitigating the risks this technology poses. Researchers and scientists working in Africa also need to focus on providing evidence-based insights that can inform policy and practice. This includes consistently providing users, and citizens in general, with awareness of the benefits and risks of neurotechnology, which is becoming the new and disruptive technology frontier.

Acknowledgments

This work was supported by the UK Research and Innovation Technology Mission Fund under the Engineering and Physical Sciences Research Council grant EP/Y009800/1, Wellcome Trust grant 226486/Z/22/Z, and the EDCTP2 program supported by the European Union under the grant agreement 101145644.

Conflicts of Interest

None declared.

  1. Hain DS, Jurowetzki R, Squicciarini M, Xu L. Unveiling the Neurotechnology Landscape: Scientific Advancements, Innovations and Major Trends. Paris, France. UNESCO; 2023.
  2. International Brain Initiative. International brain initiative: an innovative framework for coordinated global brain research efforts. Neuron. 2020;105(2):212-216. [FREE Full text] [CrossRef] [Medline]
  3. Maina MB, Ahmad U, Ibrahim HA, Hamidu SK, Nasr FE, Salihu AT, et al. Two decades of neuroscience publication trends in Africa. Nat Commun. Jun 08, 2021;12(1):3429. [CrossRef] [Medline]
  4. Müller O, Rotter S. Neurotechnology: current developments and ethical issues. Front Syst Neurosci. 2017;11:93. [FREE Full text] [CrossRef] [Medline]
  5. Goering S, Brown T, Klein E. Neurotechnology ethics and relational agency. Philos Compass. 2021;16(4):e12734. [FREE Full text] [CrossRef] [Medline]
  6. MacDuffie KE, Ransom S, Klein E. Neuroethics inside and out: a comparative survey of neural device industry representatives and the general public on ethical issues and principles in neurotechnology. AJOB Neurosci. 2022;13(1):44-54. [CrossRef] [Medline]
  7. Guzmán L. Chile: pioneering the protection of neurorights. UNESCO Cour. 2022;2022:13-14. [CrossRef]
  8. Jorgenson LA, Newsome WT, Anderson DJ, Bargmann CI, Brown EN, Deisseroth K, et al. The BRAIN initiative: developing technology to catalyse neuroscience discovery. Philos Trans R Soc Lond B Biol Sci. 2015;370(1668):20140164. [FREE Full text] [CrossRef] [Medline]
  9. Recommendation on the ethics of artificial intelligence. UNESCO. 2023. URL: https://www.unesco.org/en/articles/recommendation-ethics-artificial-intelligence [accessed 2024-07-30]
  10. Mathieson K, Denison T, Winkworth-Smith C. A transformative roadmap for neurotechnology in the UK. Innovate UK Business Connect. 2021. URL: https:/​/iuk.​ktn-uk.org/​wp-content/​uploads/​2021/​06/​A-transformative-roadmap-for-neurotechnology-in-the-UK.​pdf [accessed 2022-06-10]
  11. Stieglitz T. Why neurotechnologies? About the purposes, opportunities and limitations of neurotechnologies in clinical applications. Neuroethics. 2021;14:5-16. [CrossRef]
  12. Ethics of Neurotechnology: UNESCO, Leaders and Top Experts Call for Solid Governance. UNESCO. 2023. URL: https:/​/www.​unesco.org/​en/​articles/​ethics-neurotechnology-unesco-leaders-and-top-experts-call-solid-governance [accessed 2024-07-30]
  13. Cometa A, Falasconi A, Biasizzo M, Carpaneto J, Horn A, Mazzoni A, et al. Clinical neuroscience and neurotechnology: an amazing symbiosis. iScience. 2022;25(10):105124. [FREE Full text] [CrossRef] [Medline]
  14. Lin Y, Wang Y. Neurostimulation as a promising epilepsy therapy. Epilepsia Open. 2017;2(4):371-387. [FREE Full text] [CrossRef] [Medline]
  15. Hofmeister M, Memedovich A, Brown S, Saini M, Dowsett LE, Lorenzetti DL, et al. Effectiveness of neurostimulation technologies for the management of chronic pain: a systematic review. Neuromodulation. 2020;23(2):150-157. [CrossRef] [Medline]
  16. Schuepbach W, Rau J, Knudsen K, Volkmann J, Krack P, Timmermann L, et al. EARLYSTIM Study Group. Neurostimulation for Parkinson's disease with early motor complications. N Engl J Med. 2013;368(7):610-622. [FREE Full text] [CrossRef] [Medline]
  17. Bergfeld I, Dijkstra E, Graat I, de Koning P, van den Boom BJG, Arbab T, et al. Invasive and non-invasive neurostimulation for OCD. In: Fineberg NA, Robbins TW, editors. The Neurobiology and Treatment of OCD: Accelerating Progress. Cham. Springer International; 2021:399-436.
  18. Akhtar H, Bukhari F, Nazir M, Anwar MN, Shahzad A. Therapeutic efficacy of neurostimulation for depression: techniques, current modalities, and future challenges. Neurosci Bull. 2016;32(1):115-126. [FREE Full text] [CrossRef] [Medline]
  19. Rachid F. Neurostimulation techniques in the treatment of cocaine dependence: a review of the literature. Addict Behav. 2018;76:145-155. [CrossRef] [Medline]
  20. Chari A, Hentall ID, Papadopoulos MC, Pereira EAC. Surgical neurostimulation for spinal cord injury. Brain Sci. 2017;7(2):18. [FREE Full text] [CrossRef] [Medline]
  21. Kleimaker M, Kleimaker A, Weissbach A, Colzato LS, Beste C, Bäumer T, et al. Non-invasive brain stimulation for the treatment of gilles de la tourette syndrome. Front Neurol. 2020;11:592258. [FREE Full text] [CrossRef] [Medline]
  22. Mendes L, Lima IN, Souza T, do Nascimento GC, Resqueti VR, Fregonezi GA. Motor neuroprosthesis for promoting recovery of function after stroke. Cochrane Database Syst Rev. 2020;1(1):CD012991. [FREE Full text] [CrossRef] [Medline]
  23. Charkhkar H, Christie BP, Triolo RJ. Sensory neuroprosthesis improves postural stability during sensory organization test in lower-limb amputees. Sci Rep. 2020;10(1):6984. [FREE Full text] [CrossRef] [Medline]
  24. Alon G, McBride K, Ring H. Improving selected hand functions using a noninvasive neuroprosthesis in persons with chronic stroke. J Stroke Cerebrovasc Dis. 2002;11(2):99-106. [CrossRef] [Medline]
  25. Lorach H, Galvez A, Spagnolo V, Martel F, Karakas S, Intering N, et al. Walking naturally after spinal cord injury using a brain-spine interface. Nature. 2023;618(7963):126-133. [FREE Full text] [CrossRef] [Medline]
  26. Okahara Y, Takano K, Nagao M, Kondo K, Iwadate Y, Birbaumer N, et al. Long-term use of a neural prosthesis in progressive paralysis. Sci Rep. 2018;8(1):16787. [FREE Full text] [CrossRef] [Medline]
  27. Bailes AF, Caldwell C, Clay M, Tremper M, Dunning K, Long J. An exploratory study of gait and functional outcomes after neuroprosthesis use in children with hemiplegic cerebral palsy. Disabil Rehabil. 2017;39(22):2277-2285. [CrossRef] [Medline]
  28. Eapen BC, Murphy DP, Cifu DX. Neuroprosthetics in amputee and brain injury rehabilitation. Exp Neurol. 2017;287(Pt 4):479-485. [CrossRef] [Medline]
  29. Reinkensmeyer DJ, Dietz V, editors. Neurorehabilitation Technology. Cham. Springer International Publishing; 2016.
  30. Soekadar SR, Birbaumer N, Slutzky MW, Cohen LG. Brain-machine interfaces in neurorehabilitation of stroke. Neurobiol Dis. 2015;83:172-179. [FREE Full text] [CrossRef] [Medline]
  31. Tonin L, Millán JDR. Noninvasive brain–machine interfaces for robotic devices. Annu Rev Control Robot Auton Syst. 2021;4(1):191-214. [CrossRef]
  32. Global neurotech devices market research report. BCC Research; 2023. URL: https://www.bccresearch.com/market-research/information-technology/neurotech-devices-market.html [accessed 2024-07-30]
  33. GBD 2015 Neurological Disorders Collaborator Group. Global, regional, and national burden of neurological disorders during 1990-2015: a systematic analysis for the Global Burden of Disease Study 2015. Lancet Neurol. Nov 2017;16(11):877-897. [FREE Full text] [CrossRef] [Medline]
  34. Feigin VL, Lawes CMM, Bennett DA, Barker-Collo SL, Parag V. Worldwide stroke incidence and early case fatality reported in 56 population-based studies: a systematic review. Lancet Neurol. 2009;8(4):355-369. [CrossRef] [Medline]
  35. Ngugi A, Bottomley C, Kleinschmidt I, Sander JW, Newton CR. Estimation of the burden of active and life-time epilepsy: a meta-analytic approach. Epilepsia. 2010;51(5):883-890. [FREE Full text] [CrossRef] [Medline]
  36. Pringsheim T, Jette N, Frolkis A, Steeves TDL. The prevalence of Parkinson's disease: a systematic review and meta-analysis. Mov Disord. Nov 2014;29(13):1583-1590. [CrossRef] [Medline]
  37. Prince M, Ali GC, Guerchet M, Prina AM, Albanese E, Wu YT. Recent global trends in the prevalence and incidence of dementia, and survival with dementia. Alzheimers Res Ther. 2016;8(1):23. [FREE Full text] [CrossRef] [Medline]
  38. Winkler AS, Mosser P, Schmutzhard E. Neurological disorders in rural Africa: a systematic approach. Trop Doct. 2009;39(2):102-104. [CrossRef] [Medline]
  39. Kaddumukasa M, Mugenyi L, Kaddumukasa MN, Ddumba E, Devereaux M, Furlan A, et al. Prevalence and incidence of neurological disorders among adult ugandans in rural and urban Mukono district; a cross-sectional study. BMC Neurol. 2016;16(1):227. [FREE Full text] [CrossRef] [Medline]
  40. Adoukonou T, Adogblé L, Agbétou M, Gnonlonfoun DD, Houinato D, Ouendo EM. Prevalence of the major neurological disorders in a semi-urban community in northern Benin. eNeurologicalSci. 2020;19:100242. [FREE Full text] [CrossRef] [Medline]
  41. Sankoh O, Sevalie S, Weston M. Mental health in Africa. The Lancet Global Health. 2018;6(9):e954-e955. [CrossRef]
  42. Greene MC, Yangchen T, Lehner T, Sullivan PF, Pato CN, McIntosh A, et al. The epidemiology of psychiatric disorders in Africa: a scoping review. Lancet Psychiatry. 2021;8(8):717-731. [FREE Full text] [CrossRef] [Medline]
  43. Silberberg D, Katabira E. Neurological disorders. In: Jamison D, Feachem R, Makgoba M, Baingana F, Hofman K, Rogo K, editors. Disease and Mortality in Sub-Saharan Africa. Washington, DC. The International Bank for Reconstruction and Development / The World Bank; 2011.
  44. Gberie L. Mental Illness: Invisible but Devastating. Africa Renewal URL: https:/​/www.​un.org/​africarenewal/​magazine/​december-2016-march-2017/​mental-illness-invisible-devastating [accessed 2024-07-30]
  45. Wiskerke E, Kool J, Hilfiker R, Sattelmayer M, Verheyden G. Neurorehabilitation including virtual-reality-based balance therapy: factors associated with training response. Brain Sci. 2024;14(3):263. [FREE Full text] [CrossRef] [Medline]
  46. Prats-Bisbe A, López-Carballo J, García-Molina A, Leno-Colorado D, García-Rudolph A, Opisso E, et al. Virtual reality–based neurorehabilitation support tool for people with cognitive impairments resulting from an acquired brain injury: usability and feasibility study. JMIR Neurotech. 2024;3:e50538. [CrossRef]
  47. Schick T, editor. Functional electrical stimulation in neurorehabilitation: synergy effects of technology and therapy. Cham. Springer International Publishing; 2022.
  48. Hailey D, Roine R, Ohinmaa A, Dennett L. The status of telerehabilitation in neurological applications. J Telemed Telecare. 2013;19(6):307-310. [CrossRef] [Medline]
  49. Tang J, LeBel A, Jain S, Huth AG. Semantic reconstruction of continuous language from non-invasive brain recordings. Nat Neurosci. 2023;26(5):858-866. [CrossRef] [Medline]
  50. Pfotenhauer SM, Frahm N, Winickoff D, Benrimoh D, Illes J, Marchant G. Mobilizing the private sector for responsible innovation in neurotechnology. Nat Biotechnol. 2021;39(6):661-664. [CrossRef] [Medline]
  51. Shanmuganathan S. Artificial neural network modelling: an introduction. In: Shanmuganathan S, Samarasinghe S, editors. Artificial Neural Network Modelling. Cham. Springer International Publishing; 2016:1-14.
  52. Eke DO, Chintu SS, Wakunuma K. Towards shaping the future of responsible AI in Africa. In: Responsible AI in Africa: Challenges and Opportunities. Cham. Springer International Publishing; 2023:169-193.
  53. Eke D, Ogoh G. Forgotten African AI narratives and the future of AI in Africa. Int Rev Inf Ethics. 2022;31(1):8. [CrossRef]
  54. Eke DO, Wakunuma K, Akintoye S. Introducing Responsible AI in Africa. Africa. Springer; 2023:1-11.
  55. Ruttkamp-Bloem E. Epistemic just and dynamic AI ethics in Africa. In: In Responsible AI in Africa: Challenges and Opportunities. Cham. Springer International Publishing; 2023:13-34.
  56. Mittelstadt B. Principles alone cannot guarantee ethical AI. Nat Mach Intell. 2019;1:501-507. [CrossRef]
  57. Munn L. The uselessness of AI ethics. AI Ethics. 2022;3(3):869-877. [CrossRef]
  58. Rességuier A, Rodrigues R. AI ethics should not remain toothless! A call to bring back the teeth of ethics. Big Data Soc. 2020;7(2):205395172094254. [CrossRef]
  59. Vakkuri V, Kemell KK, Jantunen M, Abrahamsson P. "This Is Just a Prototype": How ethics are ignored in software startup-like environments. 2020. Presented at: Proceedings of the International Conference on Agile Software Development; 2020 May 28:195-210; Finland. [CrossRef]
  60. Baker D, Hanna A. AI Ethics Are in Danger. Funding Independent Research Could Help (SSIR). URL: https://ssir.org/articles/entry/ai_ethics_are_in_danger_funding_independent_research_could_help [accessed 2023-11-06]
  61. Shaw J, Ali J, Atuire CA, Cheah PY, Español AG, Gichoya JW, et al. Research ethics and artificial intelligence for global health: perspectives from the global forum on bioethics in research. BMC Med Ethics. 2024;25(1):46. [FREE Full text] [CrossRef] [Medline]
  62. Carter D. Unethical Outsourcing: ChatGPT Uses Kenyan Workers for Traumatic Moderation. URL: https:/​/www.​brusselstimes.com/​355283/​unethical-outsourcing-chatgpt-uses-kenyan-workers-for-traumatic-moderation [accessed 2024-05-01]
  63. Louis CC, Webster CT, Gloe LM, Moser JS. Hair me out: highlighting systematic exclusion in psychophysiological methods and recommendations to increase inclusion. Front Hum Neurosci. 2022;16:1058953. [FREE Full text] [CrossRef] [Medline]
  64. Ricard JA, Parker TC, Dhamala E, Kwasa J, Allsop A, Holmes AJ. Confronting racially exclusionary practices in the acquisition and analyses of neuroimaging data. Nat Neurosci. 2023;26(1):4-11. [CrossRef] [Medline]
  65. Lofton T. How one patient's textured hair nearly kept her from a needed EEG. KFF Health News. URL: https://kffhealthnews.org/news/article/black-textured-hair-eeg-racial-barriers/ [accessed 2024-07-23]
  66. Wogu E, Filima P, Caron B, Levitas D, Herholz P, Leal C, et al. A labeled clinical-MRI dataset of Nigerian brains. ArXiv. 2023. [FREE Full text] [Medline]
  67. OECD Recommendation on Responsible Innovation in Neurotechnology - OECD. URL: https://www.oecd.org/science/recommendation-on-responsible-innovation-in-neurotechnology.htm [accessed 2023-11-03]
  68. National Academies of Sciences, Engineering, and Medicine, Policy and Global Affairs, Committee on Science, Technology, and Law, Health and Medicine Division, et al. Forum on Neuroscience and Nervous System Disorders. Use of Neurotechnologies and Neuroscience in Legal Settings: Case Studies. Washington, DC. National Academies Press (US); 2018.
  69. Farahany NA. The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology. Durham, NC. St. Martin's Press; 2023.
  70. Yuste R, Genser J, Herrmann S. It's time for neuro-rights. Horizons. 2021;18:154-164. [FREE Full text]
  71. Gutierrez C. The Unforeseen Consequences of Artificial Intelligence (AI) on Society: A Systematic Review of Regulatory Gaps Generated by AI in the U.S. California. RAND Corporation; 2020.
  72. Rainey S, McGillivray K, Akintoye S, Fothergill T, Bublitz C, Stahl B. Is the European data protection regulation sufficient to deal with emerging data concerns relating to neurotechnology? J Law Biosci. 2020;7(1):lsaa051. [FREE Full text] [CrossRef] [Medline]
  73. Ienca M, Fins JJ, Jox RJ, Jotterand F, Voeneky S, Andorno R, et al. Towards a governance framework for brain data. Neuroethics. 2022;15(20). [CrossRef]
  74. Yuste R. Advocating for neurodata privacy and neurotechnology regulation. Nat Protoc. 2023;18(10):2869-2875. [CrossRef] [Medline]
  75. Grover S, Wen W, Viswanathan V, Gill CT, Reinhart RMG. Long-lasting, dissociable improvements in working memory and long-term memory in older adults with repetitive neuromodulation. Nat Neurosci. 2022;25(9):1237-1246. [FREE Full text] [CrossRef] [Medline]
  76. Takagi Y, Nishimoto S. High-resolution image reconstruction with latent diffusion models from human brain activity. IEEE; 2022. Presented at: 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR); 2023 June 17; BC, Canada. [CrossRef]
  77. Défossez A, Caucheteux C, Rapin J, Kabeli O, King J. Decoding speech perception from non-invasive brain recordings. Nat Mach Intell. 2023;5(10):1097-1107. [CrossRef]
  78. Ienca M. On neurorights. Front Hum Neurosci. 2021;15:701258. [FREE Full text] [CrossRef] [Medline]
  79. Rainey S, Dalese P. An alternative focus on data in the neurorights discussion – lessons from Brazil. Bioeth Open Res. 2023;1:3. [CrossRef]
  80. Hallinan D, Akintoye S, Stahl BC, Eke DO. Legal neuroexceptionalism: framing a concept. Eur J Law Technol. 2023;14(2). [FREE Full text]
  81. Wilson C. Public engagement and AI: a values analysis of national strategies. Gov Inf Q. 2022;39(1):101652. [CrossRef]
  82. Janssen M, Kuk G. The challenges and limits of big data algorithms in technocratic governance. Gov Inf Q. 2016;33(3):371-377. [CrossRef]
  83. Abdalla M. The grey hoodie project: Big tobacco, big tech, and the threat on academic integrity. 2021. Presented at: Proceedings of the 2021 AAAI/ACM Conference on AI, Ethics, and Society, Association for Computing Machinery; 2021 July 30; New York, USA. [CrossRef]
  84. Gerdes A. The tech industry hijacking of the AI ethics research agenda and why we should reclaim it. Discov Artif Intell. 2022;2(1):25. [CrossRef]
  85. Williams O. New Statesman. How big tech funds the debate on AI ethics. 2019. URL: https://www.newstatesman.com/science-tech/2019/06/how-big-tech-funds-debate-ai-ethics [accessed 2024-08-23]
  86. Festerling J, Siraj I. Anthropomorphizing technology: a conceptual review of anthropomorphism research and how it relates to children's engagements with digital voice assistants. Integr Psychol Behav Sci. 2022;56(3):709-738. [FREE Full text] [CrossRef] [Medline]
  87. Proudfoot D. Anthropomorphism and AI: turingʼs much misunderstood imitation game. Artif Intell. 2011;175(5-6):950-957. [CrossRef]
  88. Salles A, Evers K, Farisco M. Anthropomorphism in AI. AJOB Neurosci. 2020;11(2):88-95. [CrossRef] [Medline]
  89. Li M, Suh A. Anthropomorphism in AI-enabled technology: a literature review. Electron Markets. 2022;32(4):2245-2275. [CrossRef]
  90. Airenti G. The development of anthropomorphism in interaction: intersubjectivity, imagination, and theory of mind. Front Psychol. 2018;9:2136. [FREE Full text] [CrossRef] [Medline]
  91. Dacey M. Anthropomorphism as cognitive bias. Philos Sci. 2017;84:1152-1164. [CrossRef]
  92. Ndambi A. Congolese Student Makes Brain-Computer Interface Tool. 2023. URL: https://www.voaafrica.com/a/congolese-student-makes-brain-computer-interface-tool-/7034060.html [accessed 2023-04-09]


AI: artificial intelligence
FAIR: Findability, Accessibility, Interoperability, and Reusability
MRI: magnetic resonance imaging
OECD: Organization for Economic Cooperation and Development
UNESCO: United Nations Educational, Scientific and Cultural Organization


Edited by P Kubben; submitted 23.01.24; peer-reviewed by H Van Lente, A Carter; comments to author 28.03.24; revised version received 05.05.24; accepted 12.06.24; published 29.08.24.

Copyright

©Damian Eke. Originally published in JMIR Neurotechnology (https://neuro.jmir.org), 29.08.2024.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Neurotechnology, is properly cited. The complete bibliographic information, a link to the original publication on https://neuro.jmir.org, as well as this copyright and license information must be included.