SUMMARY - Defining “Emerging Technologies”

Baker Duck
Submitted by pondadmin on

A policymaker drafts legislation regulating "emerging technologies" to ensure ethical development, but struggles to specify which technologies the law covers. Artificial intelligence was emerging a decade ago but now powers billions of daily interactions. Blockchain was revolutionary in 2017 but has become routine infrastructure for some applications while failing to achieve predicted disruption in others. Quantum computing has been emerging for twenty years, always five to ten years away from practical deployment. Gene editing was speculative science fiction until CRISPR made it accessible to graduate students. Brain-computer interfaces exist in research laboratories and in thousands of patients with implanted devices, simultaneously experimental and deployed. The term "emerging technologies" appears in countless policy documents, ethics frameworks, and regulatory proposals, yet what it actually means, which technologies qualify, and when emergence ends and establishment begins remain frustratingly unclear. Whether this ambiguity enables flexible governance or undermines coherent regulation of technologies that may fundamentally reshape human life remains profoundly contested.

The Case for Maintaining Flexible Definitions

Advocates argue that precise definitions of emerging technologies are neither achievable nor desirable, and that governance frameworks should embrace flexibility rather than seeking false precision. From this view, the essence of emerging technologies is their uncertainty. Technologies that are genuinely emerging cannot be fully characterized because their capabilities, applications, and implications are still developing. Definitions that seemed adequate when written become obsolete as technologies evolve in unexpected directions.

Artificial intelligence illustrates this challenge. Definitions from a decade ago focused on narrow capabilities like image recognition and game playing. Those definitions did not anticipate large language models, generative AI, or multimodal systems that emerged more recently. Regulations written around earlier definitions would have missed the most consequential developments.

Flexibility enables governance to adapt as technologies evolve. Principles-based approaches establishing values like transparency, accountability, and human oversight can apply across technologies without requiring precise categorization. Risk-based frameworks assessing potential harms can address whatever technologies emerge without needing to anticipate them specifically. Adaptive governance that responds to developments as they occur may be more effective than static definitions that technology will inevitably outpace.

Moreover, premature definition may constrain beneficial development. Technologies in early stages exhibit diverse possibilities that narrow over time. Defining and regulating emerging technologies before their trajectory is clear may lock in particular directions while foreclosing alternatives that might have been better. Regulatory patience enables exploration that early definition would prevent.

From this perspective, governance should: establish principles applicable across emerging technologies rather than technology-specific rules; create adaptive mechanisms enabling response to developments as they occur; maintain flexibility to address technologies that cannot yet be anticipated; and accept that some ambiguity is preferable to false precision that will prove inadequate.

The Case for Clear Definitional Frameworks

Others argue that definitional ambiguity undermines governance, enabling both regulatory evasion and regulatory overreach while providing no clear guidance for developers, deployers, or affected populations. From this view, laws and regulations require definitions to be enforceable. A regulation covering "emerging technologies" without specifying which technologies are included creates uncertainty about what is required of whom. Developers cannot know whether their work is subject to requirements. Regulators cannot consistently determine what to enforce. Courts cannot adjudicate disputes without knowing what the law covers.

Ambiguity enables strategic manipulation. Organizations can claim their technology is not emerging when they want to avoid regulation and claim it is emerging when they want to access innovation-friendly frameworks. Without clear definitions, categorization becomes negotiation where those with more resources and influence achieve favorable classification.

Different technologies require different governance. The ethical considerations for artificial intelligence differ from those for synthetic biology, which differ from those for neurotechnology. Treating all emerging technologies uniformly through general principles may miss what is distinctive about each. Precision enables tailored governance addressing specific characteristics and risks.

Moreover, public deliberation about technology requires shared understanding of what is being discussed. Democratic input into technology governance depends on citizens understanding what technologies are at stake. Vague references to emerging technologies prevent meaningful public engagement because people cannot evaluate what they do not understand.

From this perspective, governance requires: explicit definitions specifying which technologies are covered by which frameworks; criteria for determining when technologies have emerged sufficiently to trigger regulation; periodic review and updating of definitions as technology evolves; and clarity enabling compliance, enforcement, and democratic deliberation.

The Temporal Dimension Problem

When does a technology emerge, and when does emergence end? Technologies do not announce their arrival or departure from the emerging category. From one view, emergence describes a phase in technology development between initial research and widespread deployment. Technologies emerge when they move from laboratory to application and stop emerging when they become routine.

But this framing obscures continuous evolution. Artificial intelligence has been in use for decades while continuing to exhibit novel capabilities. The internet emerged in the 1990s but continues developing in ways that raise new ethical questions. Technologies do not simply arrive and then remain static.

From another view, emergence should be defined by uncertainty rather than novelty. A technology is emerging while its capabilities, applications, and implications remain uncertain. By this definition, some older technologies remain emerging while some newer ones do not.

Whether emergence is a temporal phase, a developmental stage, or a condition of uncertainty shapes what qualifies as emerging and for how long.

The Domain Specificity Challenge

Technologies may be established in some domains while emerging in others. Artificial intelligence is mature for image recognition but emerging for scientific discovery. Blockchain is established for cryptocurrency but emerging for supply chain management. Gene editing is routine for some agricultural applications but emerging for human therapeutics.

From one perspective, governance should address technologies at the application level rather than the technology level. AI in healthcare requires different governance than AI in entertainment, regardless of underlying technical similarities.

From another perspective, application-specific governance creates fragmentation and gaps. Technologies that cross domain boundaries may fall between regulatory frameworks. Coherent technology governance requires addressing technologies themselves, not just their applications.

Whether emerging technologies should be defined and governed at the technology level or application level shapes regulatory architecture.

The Capability Versus Deployment Distinction

A technology might have capabilities that have emerged while deployment remains limited. Quantum computing has demonstrated computational capabilities but is not widely deployed. Brain-computer interfaces have achieved significant milestones in research settings but are not consumer products. Gene drives have been created in laboratories but not released into environments.

From one view, governance should address capabilities regardless of deployment, regulating what technologies can do before they are widely used. Waiting for deployment may be waiting until harms have materialized.

From another view, regulating capabilities before deployment may be premature. Many technically possible capabilities never achieve practical deployment. Regulating based on theoretical capabilities may impose burdens on technologies that never materialize.

Whether emerging technology governance should address capabilities or deployment shapes regulatory timing.

The Convergence Complication

Technologies increasingly converge, combining in ways that produce capabilities none would have alone. AI combined with biotechnology enables automated drug discovery. Neurotechnology combined with AI enables brain-computer interfaces that interpret neural signals. Nanotechnology combined with materials science enables novel substances with unprecedented properties.

From one perspective, convergent technologies require governance addressing combinations rather than individual technologies. The ethical implications of AI-enabled biotechnology may differ from those of AI or biotechnology alone.

From another perspective, governing every possible combination is impossible. Convergence multiplies categories beyond any manageable framework. Governance should address individual technologies with principles that apply regardless of combination.

Whether converging technologies require specific governance or whether general principles suffice shapes framework comprehensiveness.

The Speed of Change Challenge

Technologies emerge at different rates. Some, like CRISPR gene editing, move from discovery to widespread application within a few years. Others, like quantum computing, remain emerging for decades. Social media platforms achieved global scale in less than a decade while fusion energy has been emerging for seventy years.

From one view, governance should be calibrated to speed of change, with faster-moving technologies receiving more urgent attention and more adaptive governance mechanisms.

From another view, speed of change is difficult to predict. Technologies that seem slow-moving may suddenly accelerate. Technologies that seem fast-moving may stall. Governance based on predicted speed may be wrong about actual trajectories.

Whether speed of emergence should shape governance approach and how to predict it shapes regulatory strategy.

The Global Variation Problem

Technologies emerge differently across jurisdictions. A technology that is established in one country may be emerging in another. Regulatory frameworks that assume uniform global emergence may not reflect actual variation. A technology that American regulators consider mature may be just arriving in other contexts.

From one perspective, global governance frameworks should accommodate variation, recognizing that emergence is contextual rather than universal.

From another perspective, global technology development means that technologies emerge globally even if adoption varies. Governance frameworks should address technologies themselves rather than local adoption patterns.

Whether emergence should be defined globally or contextually shapes international coordination.

The Dual-Use Dimension

Many emerging technologies have both beneficial and harmful applications. AI can diagnose diseases or enable surveillance. Gene editing can cure genetic disorders or create biological weapons. Brain-computer interfaces can restore function to paralyzed patients or enable invasive monitoring. Drones can deliver medicine to remote areas or conduct autonomous attacks.

From one view, dual-use potential is a defining characteristic of emerging technologies, requiring governance that addresses both promise and peril.

From another view, dual-use is characteristic of most technologies and does not distinguish emerging technologies from established ones. Knives, automobiles, and electricity all have beneficial and harmful uses.

Whether dual-use potential should be central to defining emerging technologies or whether it is too common to be definitionally useful shapes how emergence is characterized.

The Public Perception Factor

Technologies may be perceived as emerging regardless of their actual development stage. Media coverage, science fiction, and public imagination shape perceptions of what technologies are new and transformative. Artificial intelligence is perceived as emerging despite decades of development. Nuclear technology is perceived as established despite ongoing innovation.

From one perspective, public perception matters for governance. Technologies perceived as emerging attract attention, funding, and regulatory interest regardless of technical reality. Governance must engage with perceptions as well as technical facts.

From another perspective, governance should be based on technical reality rather than perception. Public misunderstanding should be corrected rather than incorporated into regulatory frameworks.

Whether public perception should influence how emerging technologies are defined or whether technical criteria should prevail shapes governance approach.

The Research Versus Commercial Distinction

Technologies may be emerging in research while established commercially, or vice versa. Academic research may be exploring frontiers while commercial applications remain limited. Alternatively, commercial deployment may race ahead while research tries to understand implications.

From one view, governance should distinguish research and commercial phases, with different requirements for each. Research requires freedom to explore while commercial deployment requires accountability for impacts.

From another view, the research-commercial distinction is increasingly blurred. Technology companies conduct research while deploying products. Academic research has immediate commercial applications. Clear separation may no longer be possible.

Whether the research-commercial distinction helps define emerging technologies or whether it obscures more than it clarifies shapes regulatory design.

The Inclusion Criteria Debate

Lists of emerging technologies vary significantly across organizations and jurisdictions. Some include artificial intelligence, biotechnology, quantum computing, and nanotechnology. Others add neurotechnology, autonomous systems, advanced materials, or space technologies. Some focus narrowly on digital technologies while others encompass physical, biological, and informational domains.

From one perspective, inclusion criteria should be explicit, specifying what characteristics qualify technologies for the emerging category. Criteria might include recency of development, rate of change, transformative potential, ethical significance, or governance gaps.

From another perspective, any criteria will be contested, and different contexts may require different inclusion decisions. Flexibility to include or exclude technologies based on context may be more practical than fixed criteria.

Whether explicit inclusion criteria are necessary or whether contextual judgment suffices shapes how the category is bounded.
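As an illustration only, the explicit inclusion criteria described above could be sketched as a simple checklist score. Everything here is hypothetical: the criteria names, the thresholds, and the "any two signals" rule are assumptions made up for this sketch, not an established classification scheme.

```python
from dataclasses import dataclass

@dataclass
class TechnologyProfile:
    """Hypothetical profile of a technology for inclusion screening."""
    name: str
    years_since_first_deployment: int
    annual_capability_growth: float  # fractional improvement per year (assumed metric)
    transformative_potential: int    # expert-assigned score, 0-5 (assumed scale)
    governance_gap: bool             # True if no dedicated regulatory framework exists

def is_emerging(t: TechnologyProfile, *, max_age: int = 15,
                min_growth: float = 0.2) -> bool:
    """Hypothetical inclusion test combining the criteria mentioned in the text:
    recency, rate of change, transformative potential, and governance gaps.
    Any two signals suffice — a deliberately loose rule, since a technology like
    quantum computing fails the recency test yet is widely treated as emerging."""
    signals = [
        t.years_since_first_deployment <= max_age,  # recency of development
        t.annual_capability_growth >= min_growth,   # rate of change
        t.transformative_potential >= 4,            # transformative potential
        t.governance_gap,                           # governance gap
    ]
    return sum(signals) >= 2

# Illustrative (invented) profiles: quantum computing qualifies on three
# signals despite its age; a mature technology fires none.
quantum = TechnologyProfile("quantum computing", 25, 0.5, 5, True)
browsers = TechnologyProfile("web browsers", 30, 0.05, 2, False)
print(is_emerging(quantum))   # True
print(is_emerging(browsers))  # False
```

Even this toy version makes the contested nature of fixed criteria visible: every threshold embeds a judgment that, as the opposing perspective notes, different contexts might set differently.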

The Hype Cycle Consideration

Technologies often follow a pattern of inflated expectations, then disillusionment, then eventual productive deployment. Technologies at the peak of inflated expectations may be considered emerging regardless of actual capability. Technologies in the trough of disillusionment may be dismissed despite significant potential.

From one view, governance should account for hype cycles, maintaining attention to technologies through disillusionment phases when genuine potential remains even if enthusiasm has waned.

From another view, hype cycles are themselves social phenomena that governance should not simply accept. Regulatory frameworks should be based on technical assessment rather than following enthusiasm or dismissal.

Whether hype cycles should influence how emerging technologies are understood or whether they distort assessment shapes evaluation approach.

The Existential Risk Framing

Some emerging technologies are framed as posing existential risks to humanity. Advanced artificial intelligence, engineered pandemics, and nanotechnology have all been described as potentially threatening human survival. This framing elevates certain technologies above others in urgency and concern.

From one perspective, existential risk potential should be central to defining which emerging technologies require governance priority. Technologies that could cause civilizational harm deserve more attention than those with limited downside.

From another perspective, existential risk framing may be speculative, may distract from more immediate harms, and may reflect particular worldviews rather than objective assessment. Governance should address demonstrable risks rather than speculative ones.

Whether existential risk framing helps prioritize emerging technology governance or whether it distorts attention shapes how technologies are ranked.

The Indigenous and Traditional Knowledge Intersection

Emerging technologies often intersect with indigenous and traditional knowledge. Gene editing of culturally significant species, AI trained on traditional knowledge, and neurotechnology affecting cultural practices raise questions about whose knowledge and consent matter.

From one perspective, indigenous and traditional knowledge perspectives should inform how emerging technologies are defined and governed, particularly when technologies affect indigenous communities or draw on traditional knowledge.

From another perspective, emerging technology governance is primarily about future-oriented innovation, and connection to traditional knowledge, while important, does not define what makes technologies emerging.

Whether indigenous and traditional knowledge perspectives should shape emerging technology definitions or whether they are separate considerations affects governance inclusivity.

The Democratic Deliberation Need

Decisions about which technologies are emerging and how they should be governed have profound implications for society. These decisions should not be made solely by technical experts or industry stakeholders but should involve democratic deliberation.

From one view, defining emerging technologies requires public input because the implications affect everyone. Citizens should help determine which technologies receive attention and how governance frameworks are structured.

From another view, technical complexity limits meaningful public participation. Most people cannot evaluate quantum computing or synthetic biology. Expert judgment, while imperfect, may be more reliable than uninformed public opinion.

Whether democratic deliberation should shape emerging technology definitions or whether expertise should predominate affects governance legitimacy.

The Question

If emerging technologies are defined by uncertainty about their capabilities, applications, and implications, can governance frameworks address technologies whose very nature is not yet understood, or does effective regulation require waiting until emergence is complete and characteristics are clear? When the same technology may be emerging in one application domain while established in another, emerging in one jurisdiction while mature in another, and perceived as emerging regardless of actual development stage, can any single definition capture what makes technologies emerging, or is definitional ambiguity inevitable for technologies whose defining feature is that they are still taking shape? And if governance requires clear definitions to be enforceable while emerging technologies resist clear definition by their nature, should frameworks sacrifice precision for flexibility, sacrifice flexibility for precision, or accept that governing emerging technologies means governing what cannot yet be fully defined?
