Inability to Program Complex Organisms and Developmental Pathways
Synthetic Biology
0 capabilities · 5 candidate tools
Current genetic tools primarily enable modification of simple organisms. Programming more complex organisms and orchestrating entire developmental pathways remains a major challenge.
Candidate tools
Programmable genetic circuits are engineered genetic constructs used to create designer cells with controllable behaviors in mammalian synthetic biology. The cited literature describes circuits that can incorporate targetable DNA-binding systems such as CRISPR/Cas9 and sensor-actuator devices to regulate complex cellular functions with high spatial and temporal resolution.
The gap is about programming complex organisms, and this item is explicitly described as a mammalian synthetic biology construct pattern for controllable behaviors with sensor-actuator devices and targetable DNA-binding systems. That makes it a plausible starting architecture for coordinating multi-gene control programs in more complex cellular contexts.
Assumptions: Assumes mammalian synthetic biology is a relevant proxy for at least part of the complex-organism programming gap.
Missing evidence: No direct evidence here for developmental pathway control, whole-organism deployment, or robust in vivo performance.
Complex genetic networks are synthetic biology construct patterns assembled from interchangeable, standardized bio-parts into sensing, information processing, and effector modules. They are described as integrated gene network architectures for input-responsive control of downstream functions.
Programming developmental pathways likely requires multi-input sensing, information processing, and coordinated effector control, which matches the described architecture of complex genetic networks. The modular sensing-processing-effector framing is mechanistically relevant to building higher-order control programs.
Assumptions: Assumes integrated gene-network design is relevant to developmental control even though development is not explicitly tested here.
Missing evidence: No direct evidence for use in complex organisms, developmental systems, or long-timescale multicellular patterning.
Material-to-cell synNotch ligand platforms are engineered biomaterial and extracellular matrix systems that present synNotch ligands to mammalian cells. In the reported 2023 implementation, ligands were covalently incorporated into gelatin hydrogels or into cell-generated fibronectin-containing extracellular matrix to activate synthetic Notch receptors and induce programmed transcriptional outputs.
Developmental programming often depends on spatially presented extracellular cues, and this platform provides material-presented ligands that activate synNotch receptors to induce transcriptional programs in mammalian cells. That is a concrete way to impose engineered cell-fate or signaling inputs in multicellular contexts.
Assumptions: Assumes engineered extracellular signaling control is relevant to developmental pathway programming.
Missing evidence: No direct evidence here for full developmental pathway orchestration, tissue-scale patterning outcomes, or organism-level use.
This computation method is a predictive design framework for transcriptional programs reported in Performance Prediction of Fundamental Transcriptional Programs. It uses experimentally characterized single-input logical operations and associated metrology to model and predict the performance of more complex compressed transcriptional logic programs, including two-input AND, NOR, and mixed-phenotype NIMPLY operations.
A bottleneck in programming complex systems is designing transcriptional logic that behaves predictably, and this item is specifically a predictive framework for transcriptional program performance. It could help reduce design uncertainty when assembling more complex control programs.
Assumptions: Assumes transcriptional program design is a central subproblem of the stated gap.
Missing evidence: No direct evidence for developmental biology applications, multicellular systems, or complex-organism validation.
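To make the prediction idea concrete, composing independently characterized single-input responses into a two-input program can be sketched as below; the Hill-function model and every parameter value are illustrative assumptions, not values from the cited work.

```python
# Sketch: predict a two-input AND program's output by composing
# independently characterized single-input Hill responses.
# All parameters (K, n, basal, dynamic_range) are illustrative.

def hill(x, K, n):
    """Fractional activation of a single-input unit at inducer level x."""
    return x**n / (K**n + x**n)

def predict_and(x1, x2, K1=10.0, n1=2.0, K2=5.0, n2=2.0,
                basal=0.02, dynamic_range=1.0):
    """Naive composition: AND output ~ product of single-input activations."""
    return basal + dynamic_range * hill(x1, K1, n1) * hill(x2, K2, n2)

# Both inputs high -> near-maximal output; one input low -> near-basal.
assert predict_and(100.0, 100.0) > 0.9
assert predict_and(100.0, 0.1) < 0.1
```

The point of such a composition is metrological: if the single-input transfer curves are measured carefully, the multi-input behavior can be anticipated before the compressed program is built.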
Dynamic regulation is a metabolic engineering method that modulates gene expression over time to rebalance metabolic fluxes in response to changing cellular or fermentation conditions. It is used to build responsive cell factories rather than relying on fixed static control.
Developmental and complex-organism programming often requires time-varying control rather than static expression, and dynamic regulation is directly about modulating gene expression over time. That temporal-control concept is relevant, but the supplied evidence is centered on metabolic engineering rather than development.
Assumptions: Assumes temporal gene-expression control methods may transfer conceptually beyond metabolic engineering.
Missing evidence: Evidence is missing for use in complex organisms, mammalian development, or pathway-level developmental orchestration.
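The temporal-control concept behind dynamic regulation can be sketched as a simple feedback loop in which a sensor throttles enzyme expression as an intermediate accumulates; the model structure and all rate constants below are illustrative assumptions, not a published design.

```python
# Sketch of dynamic regulation as feedback: enzyme expression is repressed
# when a pathway intermediate accumulates, then released as it is consumed.
# All rate constants are illustrative assumptions.

def simulate(steps=2000, dt=0.01):
    enzyme, intermediate = 0.0, 0.0
    trace = []
    for _ in range(steps):
        # Sensor-promoter: expression falls as the intermediate rises.
        expression = 1.0 / (1.0 + (intermediate / 0.5) ** 2)
        d_enzyme = expression - 0.1 * enzyme                 # synthesis - dilution
        d_inter = enzyme * (0.8 - 1.2 * intermediate / (0.3 + intermediate))
        enzyme += dt * d_enzyme
        intermediate += dt * d_inter
        trace.append(intermediate)
    return trace

trace = simulate()
# Feedback keeps the intermediate bounded instead of tracking enzyme growth.
assert max(trace) < 1.0
```

A static design would fix `expression` at a constant value; the contrast with this time-varying control is exactly the responsive-cell-factory idea described above.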
Synthetic Biology Platforms Are Over-Reliant on Evolved Cells That We Don’t Fully Understand or Control
Nanoscale Fabrication
0 capabilities · 5 candidate tools
We currently perform synthetic biology using naturally evolved (“kludgy”) cells rather than truly bottom-up engineered cells. This bottleneck limits our ability to design fully customizable biological systems.
Candidate tools
GUBS (Genomic Unified Behavior Specification) is a domain-specific, rule-based declarative language for behavioral specification of synthetic biological devices. It represents device programs as behavioral specifications for open systems rather than as complete closed-system descriptions.
This item directly targets a core bottleneck in bottom-up biological engineering: specifying desired device behavior in a formal, programmable way rather than relying only on poorly understood natural cellular context. Its open-system behavioral specification framing is plausibly relevant to designing more controllable synthetic biological systems.
Assumptions: Assumes the gap includes design-specification and programmability problems, not only wet-lab chassis construction.
Missing evidence: No evidence here that GUBS has been used to build bottom-up engineered cells, minimal cells, or nanoscale fabrication systems.
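The rule-based, open-system idea can be illustrated with a minimal interpreter; this deliberately does not reproduce GUBS's actual syntax, and every rule, condition, and behavior name below is hypothetical.

```python
# Illustrative sketch only -- not GUBS syntax. It models the general idea of a
# declarative behavioral specification: each rule states "when these conditions
# hold, this behavior must follow", without closing over the whole system.

RULES = [
    # (name, premise set, obligated behavior) -- all names hypothetical.
    ("induce_reporter", {"inducer_present", "device_on"}, "express_gfp"),
    ("failsafe",        {"stress_signal"},                "halt_expression"),
]

def obligated_behaviors(observed):
    """Return the behaviors the specification obligates, given observations."""
    return {behavior for _, premises, behavior in RULES
            if premises <= observed}

# Unmodeled environmental facts are simply ignored: the spec constrains
# behavior without being a complete closed-system description.
out = obligated_behaviors({"inducer_present", "device_on", "unmodeled_fact"})
assert out == {"express_gfp"}
```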
Programmable genetic circuits are engineered genetic constructs used to create designer cells with controllable behaviors in mammalian synthetic biology. The cited literature describes circuits that can incorporate targetable DNA-binding systems such as CRISPR/Cas9 and sensor-actuator devices to regulate complex cellular functions with high spatial and temporal resolution.
Programmable genetic circuits are actionable construct patterns for imposing designed sensing and control logic on cells, which is relevant to reducing reliance on unmanaged native behavior. They could help make biological platforms more customizable even if they do not themselves create bottom-up synthetic cells.
Assumptions: Assumes partial mitigation of the gap through stronger engineered control over existing cells is still relevant.
Missing evidence: No direct evidence here for use in cell-free, protocell, or fully bottom-up chassis contexts.
Mathematical and statistical modelling is a computational design approach used in synthetic biology to improve the predictability of engineered biological systems. In the cited plant synthetic biology literature, it supports model-informed rational design for engineering plant gene regulation and metabolism.
The gap explicitly concerns limited understanding and control, and model-based design is one of the few supplied items that directly addresses predictability. It could support more rational bottom-up design workflows by reducing dependence on trial-and-error in evolved cells.
Assumptions: Assumes computational predictability is part of the bottleneck described by the gap.
Missing evidence: Evidence is from plant synthetic biology and does not show direct application to synthetic cells, minimal cells, or nanoscale fabrication.
Model-informed rational design is an engineering method in synthetic biology that uses models to guide the design of biological systems. In the cited plant context, it has been successfully applied to engineering plant gene regulation and metabolism.
This method is plausibly relevant because the gap is about moving toward intentionally designed biological systems, and model-informed design can improve rational engineering over ad hoc use of evolved cells. It is a practical first-step method for specifying and iterating designs.
Assumptions: Assumes design methodology is in scope even without a specific molecular implementation.
Missing evidence: Mechanistic detail is sparse, and the supplied evidence is limited to plant engineering rather than bottom-up synthetic cell construction.
Computational protein design is an engineering methodology described in a 2018 review as a next-generation tool for expanding synthetic biology applications. The supplied evidence frames it as a design approach used alongside phage display and high-throughput binding assays rather than as a single molecular reagent.
Bottom-up engineered cells would likely require designed components rather than inherited natural ones, and computational protein design is plausibly useful for creating such bespoke parts. That makes it a possible enabling method for reducing dependence on evolved cellular machinery.
Assumptions: Assumes the gap includes need for de novo or customized functional components.
Missing evidence: The supplied evidence does not connect this method to synthetic-cell chassis building, membrane systems, or nanoscale fabrication applications.
Biological Life is Our Only Working Example of Complex Evolved Computation
Computation
0 capabilities · 5 candidate tools
Biological systems are the sole example we have of complex, evolved computation. Replicating this level of complexity in digital systems could unlock entirely new computational paradigms.
Candidate tools
This computation method is a predictive design framework for transcriptional programs reported in Performance Prediction of Fundamental Transcriptional Programs. It uses experimentally characterized single-input logical operations and associated metrology to model and predict the performance of more complex compressed transcriptional logic programs, including two-input AND, NOR, and mixed-phenotype NIMPLY operations.
This item directly addresses design and prediction of transcriptional logic programs, which is one concrete route toward reproducing biologically inspired computation. It is more relevant than generic modeling because the supplied evidence explicitly covers prediction of multi-input logic behavior.
Assumptions: Assumes the gap can be partially addressed by building and predicting biological-style logic programs rather than full evolved computation.
Missing evidence: No evidence here that the framework captures evolutionary dynamics, open-ended complexity growth, or digital-system implementation.
GUBS (Genomic Unified Behavior Specification) is a domain-specific, rule-based declarative language for behavioral specification of synthetic biological devices. It represents device programs as behavioral specifications for open systems rather than as complete closed-system descriptions.
GUBS is a rule-based language for specifying behaviors of synthetic biological devices in open systems, which is plausibly useful for formalizing aspects of biological computation. It could help structure and compile complex behavioral programs even if it does not itself recreate evolved complexity.
Assumptions: Assumes formal specification of open-system biological behaviors is a useful intermediate step for the gap.
Missing evidence: No direct evidence that GUBS supports evolved computation, adaptive search, or transfer to digital computational paradigms.
Boolean logic gates are synthetic genetic circuits that integrate multiple biological inputs into a defined output state. The supplied evidence indicates that such circuits have been developed with up to six inputs and are discussed as components within synthetic circuits of varying complexity.
Synthetic Boolean logic gates are one actionable way to instantiate multi-input biological computation. They are relevant to the computation theme, but the supplied evidence only supports relatively bounded circuit complexity rather than evolved, open-ended computation.
Assumptions: Assumes the gap includes near-term construction of biological computing primitives.
Missing evidence: No evidence here for scalability beyond limited input counts, evolutionary adaptation, or relevance to digital replication of biological complexity.
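A six-input gate of the kind described can be sketched as a truth-table evaluation; the specific logic function below (three activators AND NOT any of three repressors) is an illustrative assumption, not a circuit from the cited work.

```python
# Sketch: a six-input genetic logic function evaluated as a truth table.
# The chosen function is an illustrative assumption.

from itertools import product

def gate(a1, a2, a3, r1, r2, r3):
    """ON only if all three activators are present and no repressor is."""
    return (a1 and a2 and a3) and not (r1 or r2 or r3)

# Enumerate the full 2^6 = 64-row truth table.
table = {inputs: gate(*inputs) for inputs in product([False, True], repeat=6)}
on_states = [inputs for inputs, out in table.items() if out]

# Exactly one of the 64 input combinations turns this gate on.
assert len(table) == 64
assert on_states == [(True, True, True, False, False, False)]
```

The exponential growth of the truth table (2^n rows for n inputs) is one way to see why evidence for circuits beyond about six inputs is the relevant scalability question.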
Mathematical and statistical modelling is a computational design approach used in synthetic biology to improve the predictability of engineered biological systems. In the cited plant synthetic biology literature, it supports model-informed rational design for engineering plant gene regulation and metabolism.
Mathematical and statistical modelling could support abstraction and prediction of biological system behavior, which is relevant to understanding complex biological computation. However, the supplied evidence is broad and tied to predictability in engineered biology rather than to evolved computation specifically.
Assumptions: Assumes general modeling tools are acceptable as enabling methods for the gap.
Missing evidence: Mechanistic evidence is generic; no direct link to complex evolved computation, adaptive dynamics, or digital emulation.
Model-informed rational design is an engineering method in synthetic biology that uses models to guide the design of biological systems. In the cited plant context, it has been successfully applied to engineering plant gene regulation and metabolism.
Model-informed rational design is a plausible enabling method for engineering systems that exhibit more sophisticated biological information processing. Its relevance is limited because the evidence is broad and focused on plant gene regulation and metabolism rather than computation as such.
Assumptions: Assumes the gap includes incremental engineering of biologically inspired computational systems.
Missing evidence: No specific mechanism or demonstrated use for evolved computation, circuit complexity, or digital computational paradigms.
Most of the Human Brain Remains Inaccessible
Neuroscience
0 capabilities · 5 candidate tools
Large portions of the living human brain are difficult to observe and modulate with current technologies. Safer, noninvasive, or minimally invasive methods are needed to capture real-time brain state information.
One funding program dedicated to advancing this space is run by ARIA (the UK's Advanced Research and Invention Agency), which launched the Scalable Neural Interfaces opportunity space to support a new suite of tools to interface with the human brain at scale.
Candidate tools
NIR light-based imaging is an optical assay and photoregulation approach that uses near-infrared light to sense, and in some cases modulate, specific cellular events in living systems. The cited review describes these strategies as enabling real-time interrogation of deep tissues with subcellular accuracy.
The gap centers on real-time access to deep brain state information, and this item is explicitly described as enabling real-time interrogation of deep tissues using near-infrared light. It is one of the few supplied items that directly matches the sensing side of minimally invasive brain interfacing.
Assumptions: Assumes deep-tissue optical interrogation could be relevant to brain interface development.
Missing evidence: No brain-specific, human-specific, safety, penetration-depth, or performance data are provided.
NIR light-based photoregulation is an engineering approach that uses near-infrared light to sense and/or modulate specific cellular events in living systems. It is described as supporting real-time operation in deep tissues with subcellular accuracy.
The gap also includes the need to modulate inaccessible brain regions, and this item is described as using NIR light to modulate cellular events in living systems with deep-tissue operation. That makes it a plausible minimally invasive actuation strategy, at least in principle.
Assumptions: Assumes cellular photoregulation methods could be adapted toward neural modulation.
Missing evidence: No neuron-specific, brain-specific, human-use, delivery, or safety evidence is supplied.
Up-conversion phosphors are material-based light-delivery platforms used to enable remote optogenetic control of neuronal activity in living animals. They are being explored as wireless, less invasive approaches for controlling cellular functions in the brain and other tissues.
This item is specifically framed as a less invasive, wireless light-delivery approach for remote optogenetic control of neuronal activity in living animals. That directly aligns with the modulation side of the brain-access gap, especially where conventional invasive optical hardware is limiting.
Assumptions: Assumes remote optical control of neurons is relevant to scalable neural interfaces.
Missing evidence: No evidence is provided for human brain use, noninvasiveness in practice, required implants/material placement, or comparative efficacy versus existing neural interface methods.
A real-time GPCR agonist sensor is a category of genetically encoded GPCR agonist detection tools defined by its ability to provide real-time information on the signalling dynamics of GPCR agonists. In the cited review, it is distinguished from integrator sensors as one of two main classes of genetically encoded GPCR agonist sensors.
Real-time sensing of neuromodulatory signaling could contribute to capturing brain state information, and this item is explicitly a real-time genetically encoded sensor for GPCR agonist dynamics. It is a narrower fit than whole-brain interface technologies but could plausibly support molecular readouts of neural state.
Assumptions: Assumes GPCR agonist dynamics are a relevant proxy for some brain states.
Missing evidence: No evidence is supplied for use in neurons, deep tissue, human brain contexts, delivery feasibility, or noninvasive readout.
Fluorescence microscopy is an imaging assay method used to detect and localize fluorescent signals in living biological specimens. In the supplied evidence, it is described for larval zebrafish as a means to achieve subcellular fluorescence localization and real-time monitoring of cell identity, fate, physiology, and organ pathophysiology.
This item supports real-time fluorescence monitoring in living specimens, so it is at least relevant to observing dynamic biological state. However, the supplied evidence is from larval zebrafish and does not address the core challenge of accessing large portions of the living human brain.
Missing evidence: No evidence for deep-brain access, minimal invasiveness, human applicability, or neural-interface scale.
We Can’t Take High-Resolution Movies of or Intervene in Brain Computation at the Single Neuron Level
Neuroscience
0 capabilities · 5 candidate tools
Capturing the dynamics of large brain networks at single-neuron resolution in vivo is extremely challenging. Advanced imaging methods that record fast, high-resolution activity without destructive intervention are required to unravel the complex interplay of neuronal circuits in real time.
Candidate tools
Lattice lightsheet microscopy (LLSM) is a modified light-sheet imaging platform used for three-dimensional optogenetic activation with subcellular resolution. In the cited 2022 study, it enabled high-spatiotemporal-resolution manipulation of cellular behavior, including membrane ruffling and guided cell migration.
This item directly combines high-spatiotemporal-resolution light-sheet imaging with subcellular optogenetic photoactivation, which is mechanistically aligned with the gap's need to both observe and intervene in neural activity. The evidence is not in neurons, but the integrated imaging-plus-perturbation capability is unusually close to the stated bottleneck.
Assumptions: Assumes the platform could be adapted from general cell behavior studies to neural tissue or neuronal preparations.
Missing evidence: No supplied evidence for use in brain tissue, single-neuron activity readout, or large in vivo neural networks.
Light-sheet microscopy, also termed single-plane illumination microscopy, is an in vivo fluorescence imaging method tailored to larval research and embryonic imaging. The supplied evidence indicates that it can capture the full course of embryonic development from egg to larva and has been coupled with optogenetic perturbation to study Wnt signaling during embryogenesis.
Light-sheet microscopy is an in vivo fluorescence imaging method and the supplied evidence also notes coupling to optogenetic perturbation, matching the gap's dual need for recording and intervention. Its volumetric imaging orientation could plausibly support higher-resolution movies of many cells than standard point-scanning approaches.
Assumptions: Assumes light-sheet implementations could be adapted to neuroscience use cases.
Missing evidence: No supplied evidence for neuronal activity imaging, single-neuron resolution in brain circuits, or adult mammalian brain compatibility.
Nano-lantern is a bright bioluminescent protein construct platform for real-time multicolor live-cell imaging. In the cited work, a previously developed yellowish-green Nano-lantern was expanded to cyan and orange luminescent variants, enabling imaging of intracellular structures, gene expression, and Ca2+ dynamics.
Nano-lantern is a genetically encoded live-imaging construct platform with reported Ca2+ dynamics readout, which is relevant to monitoring neuronal activity. Its lack of excitation light requirement could reduce phototoxicity or optical interference during fast imaging experiments.
Assumptions: Assumes calcium dynamics reporting could be useful as a proxy for neuronal computation in some settings.
Missing evidence: No supplied evidence in neurons, in vivo brain imaging, single-neuron resolution, or compatibility with simultaneous intervention.
Live-cell imaging is an assay method used in neurons in culture and brain slices to observe dynamic cellular processes in real time. The cited studies applied it to visualize minute-scale membrane PI(3,4,5)P3 fluctuations and microtubule retrograde flow during neuronal polarization-related dynamics.
This is directly evidenced in neurons in culture and brain slices for real-time optical observation, so it has some contextual relevance to neural dynamics. It could support early-stage testing of imaging strategies for single-cell neuronal behavior, even though the supplied evidence is far from large-scale in vivo brain computation.
Assumptions: Assumes ex vivo neuronal imaging can be a stepping stone toward the stated in vivo gap.
Missing evidence: No supplied evidence for fast activity imaging, single-neuron computation in vivo, or integrated perturbation capability.
Flash-and-freeze is an assay method that induces neuronal activity with a flash of light and captures membrane dynamics by rapid freezing. It was developed to visualize activity-evoked synaptic membrane trafficking with millisecond temporal resolution and was used to identify ultrafast endocytosis during neurotransmission.
This method is directly neuronal and offers millisecond temporal resolution with light-triggered stimulation, so it addresses part of the intervention-and-timing problem. However, it captures frozen snapshots of membrane dynamics rather than non-destructive high-resolution movies of ongoing brain computation.
Assumptions: Assumes destructive snapshot methods may still inform parts of the temporal bottleneck.
Missing evidence: No supplied evidence for in vivo large-network recording, repeated movie acquisition, or single-neuron functional readout across brain circuits.
Our Measurements and Tests Aren’t Revealing What Is Actually Causing Many Diseases
Physiology and Medicine
0 capabilities · 5 candidate tools
Our understanding of human physiology and disease remains incomplete. Over the last century, we have developed cures or effective preventions for many diseases with well-defined root causes (polio, smallpox, cholera, SMA, cervical cancer, etc.), yet a wide array of conditions still eludes treatment. We have yet to fully decipher the dynamic interplay between the brain and peripheral systems, the bioenergetic processes underlying chronic conditions, and the multifactorial pathways that drive aging: complex diseases and the aging process alike arise from multiple interacting pathways.
Although we understand some individual aging mechanisms, we do not yet have line of sight to comprehensively rejuvenating mammals or extending lifespan. To overcome these challenges, we need combinatorial approaches that can modulate multiple mechanisms simultaneously, allowing us to measure multi-system impacts and develop effective interventions.
Candidate tools
Synthetic circuits are multi-component biological switch systems engineered to interrogate endogenous signaling circuitry and to couple detected biological events to defined outputs. The cited source describes their use as detectors and as temporally or spatially controlled inducers through signal integration logic.
The gap emphasizes that complex diseases arise from interacting pathways and need measurements that capture multi-system logic. Synthetic circuits are explicitly described as detectors that interrogate endogenous signaling and couple detected events to defined outputs, which could help build readouts for combinatorial disease states rather than single markers.
Assumptions: Assumes engineered detector circuits are acceptable as measurement tools for mechanistic studies rather than direct therapies.
Missing evidence: No supplied evidence on use in human disease models, aging contexts, or whole-organism multi-system measurement.
Signal integration circuits are synthetic multi-component biological switches that combine multiple input modalities to produce an output only under a precise pre-programmed set of conditions. The cited literature describes them as tools for event detection and for temporally or spatially controlled induction.
This gap calls for approaches that can resolve multifactorial mechanisms across interacting pathways. Signal integration circuits are described as event-detection tools that combine multiple inputs into a defined output, making them a plausible way to detect higher-order biological states that simpler assays miss.
Assumptions: Assumes the intended use is mechanistic sensing of combined pathway states.
Missing evidence: No direct evidence here for deployment in mammalian physiology, aging, or disease-causality studies.
Boolean logic gates are synthetic genetic circuits that integrate multiple biological inputs into a defined output state. The supplied evidence indicates that such circuits have been developed with up to six inputs and are discussed as components within synthetic circuits of varying complexity.
Boolean logic gates could help operationalize multifactorial disease hypotheses by requiring combinations of biological inputs before producing a readout. That is relevant when the bottleneck is that single measurements fail to reveal causal multi-pathway states.
Assumptions: Assumes multi-input genetic logic can be adapted as a research assay rather than only as a control module.
Missing evidence: No supplied evidence for specific disease-measurement applications, physiological systems coverage, or in vivo performance.
NIR light-based imaging is an optical assay and photoregulation approach that uses near-infrared light to sense, and in some cases modulate, specific cellular events in living systems. The cited review describes these strategies as enabling real-time interrogation of deep tissues with subcellular accuracy.
The gap specifically highlights incomplete measurement of dynamic interactions across tissues and systems. NIR light-based imaging is described as enabling real-time interrogation of deep tissues in living systems, so it could plausibly improve observation of otherwise inaccessible physiological events.
Assumptions: Assumes deep-tissue optical sensing is relevant to the disease mechanisms of interest.
Missing evidence: The supplied evidence does not specify which cellular events can be measured, multiplexing capacity, or demonstrated relevance to complex disease or aging.
The high-throughput online monitoring system with an LED array is an assay platform for screening light-controlled gene expression conditions by individually illuminating each well in a multiwell format. In the cited yeast study, it was used with photocaged Cu2+ to regulate the Cu2+-inducible pCUP1 promoter from Saccharomyces cerevisiae and monitor eYFP expression.
This is one of the few explicitly high-throughput assay platforms in the set, and the gap mentions the need for combinatorial approaches. It could plausibly support rapid screening of multi-condition perturbation/measurement schemes, even though the provided example is narrow and light-control specific.
Assumptions: Assumes the gap includes early-stage assay development and screening rather than direct clinical measurement.
Missing evidence: Evidence is limited to a yeast light-controlled expression setup; no evidence for mammalian disease models, causal mechanism discovery, or broader physiological readouts.
Limited Microbial Hosts/Chassis Organisms
Synthetic Biology
0 capabilities · 5 candidate tools
Scientists are constrained to a small number of microbial hosts for bioproduction, limiting the diversity and efficiency of engineered biological systems. Expanding the repertoire of microbial hosts could unlock novel biochemical pathways, enabling the production of a wider array of biomolecules and improving the efficiency of biosynthetic processes. It is important to address any biosafety and biosecurity risks associated with developing such technologies.
Candidate tools
Dynamic regulation is a metabolic engineering method that modulates gene expression over time to rebalance metabolic fluxes in response to changing cellular or fermentation conditions. It is used to build responsive cell factories rather than relying on fixed static control.
If new microbial chassis are hard to use because pathway expression and flux balance are host-dependent, dynamic regulation could help tune expression over time rather than relying on static designs. That makes it a plausible method for adapting biosynthetic programs to nonstandard hosts.
Assumptions: Assumes the bottleneck in expanding chassis includes poor pathway balancing after transfer into new microbes.
Missing evidence: No direct evidence here that the method was demonstrated in novel or non-model microbial hosts.
Mathematical modeling is a computational method used to guide the rational design of synthetic gene circuits. The cited literature also places it alongside live-cell imaging and within quantitative model systems used to study microbial drug resistance and spatiotemporal features of cancer in mammalian cells.
Mathematical modeling could support rational design when porting circuits or pathways into less familiar microbial hosts, reducing trial-and-error in chassis development. It is especially plausible as a lightweight first-pass method for comparing design options before building strains.
Assumptions: Assumes computational design support is useful for host expansion even though the item is not host-specific.
Missing evidence: No direct evidence here for modeling specifically enabling expansion to new microbial chassis or improving host selection.
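As a minimal sketch of such a first-pass comparison, a toy ODE model integrated with forward Euler can rank two hypothetical promoter strengths before any strain is built; the model and all parameters are illustrative assumptions.

```python
# Lightweight first-pass comparison of two hypothetical promoter strengths
# for a product-forming pathway. Toy ODE, forward-Euler integration;
# synthesis, dilution, and conversion rates are illustrative only.

def final_titer(promoter_strength, t_end=50.0, dt=0.01):
    enzyme, titer = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        d_enzyme = promoter_strength - 0.2 * enzyme   # synthesis - dilution
        d_titer = 0.5 * enzyme                        # conversion flux
        enzyme += dt * d_enzyme
        titer += dt * d_titer
    return titer

# In this linear toy model, predicted titer scales with promoter strength,
# so the stronger design would be prioritized for strain construction.
assert final_titer(2.0) > final_titer(0.5)
```

Richer models (host-specific burden, saturation, toxicity) would change the ranking logic, which is precisely why host-specific evidence is flagged as missing above.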
Synthetic multienzyme complexes are engineered multienzyme assemblies, spanning static enzyme nanostructures to dynamic enzyme coacervates, that arrange sequential biosynthetic enzymes in metabolon-like configurations. Some complexes assembled inside microbial cells can be isolated as independent catalytic entities and used for ex vivo biosynthesis.
For alternative microbial hosts where pathway flux is limited by poor enzyme organization, synthetic multienzyme complexes could partially decouple production performance from native intracellular context. That could make some nonstandard hosts more usable for bioproduction.
Assumptions: Assumes pathway performance in new hosts is limited by enzyme colocalization or intermediate loss.
Missing evidence: No direct evidence here that these complexes were used to establish or broaden microbial chassis compatibility.
Metabolons are multienzyme assemblies that colocate sequential enzyme active sites to improve pathway flux. Synthetic metabolon-like complexes have been reported in microbial systems and, in some cases, can be isolated as independent entities that catalyze biosynthetic reactions ex vivo.
Metabolon-style enzyme colocalization could help when a candidate microbial chassis supports pathway expression but not efficient flux. This is a plausible pathway-optimization strategy for making additional hosts more productive.
Assumptions: Assumes the gap includes improving productivity of newly considered hosts, not only discovering them.
Missing evidence: No direct evidence here for use in chassis expansion or for performance across diverse microbial hosts.
CRISPR/Cas9 is a bacterial type II genome editing system repurposed as a programmable nuclease for target DNA cleavage and site-specific genome modification. The supplied evidence states that it was engineered for gene editing in mammalian cells by 2013 and is used to interrupt gene expression through cleavage of target DNA.
Programmable genome editing is often needed to domesticate or engineer new microbial hosts, so CRISPR/Cas9 is plausibly relevant as a chassis-enabling tool. It could help create targeted modifications once a candidate host is genetically tractable.
Assumptions: Assumes the gap includes engineering newly adopted microbes rather than only identifying them.
Missing evidence: The supplied evidence emphasizes mammalian-cell engineering and does not provide direct evidence for use in diverse microbial chassis.
Cellular and Biomolecular States Are Highly Multimodal and Complex
Cellular and Molecular Biology
0 capabilities · 4 candidate tools
Cellular state is a multifaceted and complex phenomenon, involving multiple overlapping omics layers that vary in time and space. Capturing and representing this multimodal complexity is essential for predictive modeling of cell behavior and for advancing our understanding of cellular function.
Candidate tools
Live-cell imaging is an assay method used in neurons in culture and brain slices to observe dynamic cellular processes in real time. The cited studies applied it to visualize minute-scale membrane PI(3,4,5)P3 fluctuations and microtubule retrograde flow during neuronal polarization-related dynamics.
This assay directly addresses the temporal aspect of complex cellular state by observing dynamic processes in real time. It can help capture one important modality of state variation, especially time-resolved cellular behavior, even though the provided evidence does not show integrated multimodal readouts.
Assumptions: Assumes dynamic state measurement is a useful subproblem for this gap.
Missing evidence: No evidence here for multiplexed multi-omic integration, spatial omics, or broad multimodal state representation beyond optical dynamics.
Markov State Modeling (MSM) is a computational method applied with molecular dynamics simulations to resolve conformational dynamics in the AsLOV2 photosensory domain. In the cited 2023 study, MSM was used to explain blue-light-induced stepwise unfolding of the C-terminal Jα-helix and to identify seven structurally distinguishable unfolding states spanning initiation and post-initiation phases.
This method is explicitly about decomposing complex conformational dynamics into distinguishable states, which is relevant to representing biomolecular state complexity. It is a plausible fit for the biomolecular side of the gap, but the supplied evidence is limited to a specific photosensory protein system rather than general multimodal cellular state modeling.
Assumptions: Assumes the gap includes biomolecular conformational state representation, not only cell-level omics integration.
Missing evidence: No evidence for integration across multiple data modalities, cell-scale state modeling, or applicability beyond the cited protein example.
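The core MSM estimation step (discretizing a trajectory into states, counting transitions at a lag time, row-normalizing into a transition matrix, and extracting the stationary distribution) can be sketched in a few lines; the toy trajectory and lag below are illustrative, not data from the cited AsLOV2 study.

```python
# Core estimation step of a Markov state model (MSM), sketched on a
# toy discretized trajectory. State labels and lag time are illustrative.

def estimate_msm(trajectory, n_states, lag=1):
    """Count state-to-state transitions at the given lag, row-normalize."""
    counts = [[0.0] * n_states for _ in range(n_states)]
    for t in range(len(trajectory) - lag):
        counts[trajectory[t]][trajectory[t + lag]] += 1.0
    T = []
    for row in counts:
        total = sum(row)
        T.append([c / total if total else 0.0 for c in row])
    return T

def stationary(T, iters=500):
    """Power iteration for the stationary distribution pi (pi @ T = pi)."""
    n = len(T)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * T[i][j] for i in range(n)) for j in range(n)]
    return pi

traj = [0, 0, 1, 1, 1, 0, 1, 2, 2, 1, 0, 0, 1, 2, 1, 1, 0, 1]
T = estimate_msm(traj, n_states=3)
pi = stationary(T)
```

In practice the discretization comes from clustering MD frames and the lag time is chosen from implied-timescale convergence, but the counting-and-normalizing core is the same.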
Molecular dynamics simulation is a computational method for modeling atomistic conformational dynamics of proteins and analyzing residue fluctuations and vibrational behavior. In the cited studies, it was used as a noninvasive approach to validate dynamic behavior and to compare PAS-domain dynamics across functional groups.
This method can characterize biomolecular state landscapes and dynamics at atomistic resolution, which is relevant to one layer of complex biomolecular state. It may help on the mechanistic representation side, but the provided evidence does not support multimodal cellular-state capture or integration across omics layers.
Assumptions: Assumes biomolecular dynamics are in scope as one component of the stated multimodal complexity problem.
Missing evidence: No evidence for multimodal data fusion, spatial or temporal cell-state integration, or direct use for predictive modeling of whole-cell behavior.
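One standard residue-fluctuation analysis of the kind mentioned above is the root-mean-square fluctuation (RMSF) of each residue about its mean position. The sketch below uses a tiny synthetic two-frame trajectory in place of aligned MD snapshots.

```python
# Per-residue RMSF from a toy coordinate trajectory; the synthetic
# coordinates below stand in for aligned MD snapshots.

import math

def rmsf(trajectory):
    """trajectory: list of frames, each a list of (x, y, z) per residue.
    Returns the root-mean-square fluctuation about the mean position."""
    n_frames = len(trajectory)
    n_res = len(trajectory[0])
    out = []
    for r in range(n_res):
        mean = [sum(f[r][d] for f in trajectory) / n_frames for d in range(3)]
        msd = sum(sum((f[r][d] - mean[d]) ** 2 for d in range(3))
                  for f in trajectory) / n_frames
        out.append(math.sqrt(msd))
    return out

# Residue 0 is static; residue 1 oscillates along x.
frames = [[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)],
          [(0.0, 0.0, 0.0), (-1.0, 0.0, 0.0)]]
vals = rmsf(frames)
```

Real analyses first superpose frames onto a reference structure so that rigid-body motion does not inflate the fluctuations; that alignment step is omitted here.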
The theoretical probability of neighbor density (PND) is a computational method introduced to discern protein oligomeric states in cellular environments. It is described as robust, precise, and adaptable for analyzing oligomerization scenarios spanning monomers to hexamers.
This computational method helps distinguish protein oligomeric states in cellular environments, which could contribute one interpretable biomolecular modality within a broader state-representation workflow. Its relevance is narrow because the evidence only supports oligomerization-state analysis, not general multimodal cellular-state capture.
Assumptions: Assumes resolving specific biomolecular sub-states is a useful component of the broader gap.
Missing evidence: No evidence for combining this with other omics or imaging modalities, or for broader cell-state modeling.
Understanding Life as a Far-From-Equilibrium Physical Phenomenon
Biophysics
0 capabilities · 4 candidate tools
Our ability to analyze organisms holistically, as systems that emerge from fundamental physics, is limited by the lack of formal frameworks for distinguishing living from nonliving systems that are precise enough to be useful for practical scientific problems.
Candidate tools
GUBS (Genomic Unified Behavior Specification) is a domain-specific, rule-based declarative language for behavioral specification of synthetic biological devices. It represents device programs as behavioral specifications for open systems rather than as complete closed-system descriptions.
The gap is explicitly about lacking formal frameworks for distinguishing living from nonliving systems in a practically useful way. GUBS is a rule-based declarative language for behavioral specification of open systems, which is at least directionally aligned with building formal, operational descriptions of life-like system behavior under non-equilibrium conditions.
Assumptions: Assumes a formal specification language for open biological systems is relevant to this gap even though the item is framed for synthetic biological devices.
Missing evidence: No evidence is provided that GUBS has been used to distinguish living versus nonliving systems, model far-from-equilibrium physics, or analyze whole organisms holistically.
High-resolution live imaging is cited as a methodological approach that provides new opportunities to study branching morphogenesis in living systems. The supplied evidence identifies it only at the level of a general live-imaging assay and does not specify the imaging modality, reporter strategy, or biological model.
A practical route toward far-from-equilibrium descriptions of living systems is to capture time-resolved dynamics in intact living samples. High-resolution live imaging could help generate dynamic observational data needed to compare emergent behaviors of living systems, although the item is only described very generally.
Assumptions: Assumes dynamic live imaging data would be useful for constructing or testing physical frameworks of living systems.
Missing evidence: The evidence does not specify imaging modality, measurable observables, quantitative outputs, organismal scale, or any use for equilibrium or non-equilibrium analysis.
NIR light-based imaging is an optical assay and photoregulation approach that uses near-infrared light to sense, and in some cases modulate, specific cellular events in living systems. The cited review describes these strategies as enabling real-time interrogation of deep tissues with subcellular accuracy.
The item supports real-time sensing of cellular events in living systems, which could in principle provide dynamic measurements relevant to non-equilibrium behavior. Its deep-tissue imaging angle may be useful for more holistic observation, but the supplied evidence does not connect it to the conceptual bottleneck in the gap.
Assumptions: Assumes real-time deep-tissue imaging is relevant to studying emergent physical behavior in living systems.
Missing evidence: No evidence is provided for specific observables, quantitative physical readouts, organism-level analysis, or use in distinguishing living from nonliving systems.
Fluorescence resonance energy transfer-fluorescence lifetime imaging microscopy (FRET-FLIM) is an assay method that combines fluorescence resonance energy transfer with fluorescence lifetime imaging to detect molecular proximity in living cells. In the cited Arabidopsis study, it was used to support a physical interaction between CRY2 and SPA1 in nuclei.
FRET-FLIM can quantify molecular proximity in living cells, so it could contribute fine-grained dynamic measurements of interaction networks that underlie emergent non-equilibrium behavior. However, it addresses only a narrow measurement layer and not the gap's need for a formal framework.
Assumptions: Assumes molecular interaction dynamics are a useful component of broader physical descriptions of living systems.
Missing evidence: No evidence is provided for system-level applicability, non-equilibrium analysis, or use beyond a specific protein interaction case.
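In FLIM-based FRET, energy transfer is read out as a shortening of the donor fluorescence lifetime. A minimal sketch of that readout, with illustrative lifetime values rather than numbers from the cited Arabidopsis study:

```python
# FLIM readout of FRET: energy transfer shortens the donor fluorescence
# lifetime. Lifetime values below are illustrative.

def fret_efficiency(tau_da_ns, tau_d_ns):
    """E = 1 - tau_DA / tau_D, where tau_DA is the donor lifetime in the
    presence of the acceptor and tau_D is the donor-only lifetime."""
    return 1.0 - tau_da_ns / tau_d_ns

interacting = fret_efficiency(tau_da_ns=1.5, tau_d_ns=2.5)      # shortened lifetime
non_interacting = fret_efficiency(tau_da_ns=2.5, tau_d_ns=2.5)  # unchanged lifetime
```

Because lifetime is intensity-independent, this readout is less sensitive to expression level and photobleaching than intensity-ratio FRET, which is why FLIM is preferred for in-planta interaction calls like CRY2-SPA1.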
Insufficient Surveillance of Bio-Threats
Biosecurity
0 capabilities · 3 candidate tools
Rapid detection of emerging bio-threats is critical for effective intervention and containment. However, many pathogens are detected only after widespread transmission has occurred. In addition, attributing the source of these threats remains challenging.
Candidate tools
CRISPR-Cas technology comprises CRISPR-associated effector proteins that recognize specific DNA or RNA sequences and cleave them. In the cited review, it is presented primarily as a platform for rapid pathogen nucleic acid detection that leverages Cas trans-cleavage activity together with signal amplification and signal transformation strategies.
The supplied evidence directly describes CRISPR-Cas as a platform for rapid pathogen nucleic acid detection, which matches the gap's core need for earlier detection of emerging bio-threats. Sequence-specific recognition plus trans-cleavage, signal amplification, and signal transformation are actionable diagnostic mechanisms.
Assumptions: Assumes surveillance use includes pathogen nucleic acid testing workflows.
Missing evidence: No specific field-deployable format, pathogen panel breadth, limit of detection, attribution capability, or deployment setting is provided.
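The detection logic the review describes, sequence-specific target recognition gating collateral reporter cleavage that amplifies the signal, can be caricatured in a few lines. The sequences, reporter pool size, and threshold below are illustrative, not a real assay design.

```python
# Caricature of Cas trans-cleavage detection logic: target recognition
# gates nonspecific reporter cleavage, converting a few target copies
# into an amplified signal. Sequences and threshold are illustrative.

def detect(sample_reads, guide, reporter_pool=1000, min_hits=1):
    """Return (positive, signal): reporters are 'cleaved' (signal) only
    when the guide-matching target sequence appears in the sample."""
    hits = sum(guide in read for read in sample_reads)
    signal = reporter_pool if hits >= min_hits else 0  # collateral cleavage
    return hits >= min_hits, signal

positive, signal = detect(["ACGTTGCAGGTT", "TTGACCGGAATC"], guide="GCAGG")
negative, no_signal = detect(["TTGACCGGAATC"], guide="GCAGG")
```

The point of the caricature is the gain structure: one recognition event licenses cleavage of a large reporter pool, which is what makes the platform fast and sensitive enough for surveillance use.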
Single-cell RNA sequencing (scRNA-seq) is a transcriptomic assay method that measures RNA molecules in individual cells by sequencing-based transcript detection. In the cited application, it detected FLiCRE transcripts within the endogenous transcriptome, enabling simultaneous readout of cell type and calcium activation history.
Single-cell RNA sequencing is an actionable sequencing-based detection assay and could plausibly support high-resolution characterization of infected samples or host-response signatures during surveillance investigations. It may be more useful for follow-up characterization than for primary rapid detection.
Assumptions: Assumes surveillance can include laboratory characterization workflows, not only point-of-need detection.
Missing evidence: The supplied evidence does not show pathogen surveillance use, turnaround time, field suitability, source attribution performance, or direct pathogen detection in this context.
The custom Python-based API is a software interface for assembling automation workflows on an open-source microplate reader. It enables programmable control of automated assay protocols for an instrument demonstrated for full-spectrum absorbance, fluorescence emission detection, and in situ optogenetic stimulation.
This API could plausibly help automate microplate-based assay workflows, which may support more scalable screening or surveillance lab operations. Its relevance is operational rather than a direct detection mechanism.
Assumptions: Assumes surveillance bottlenecks include assay automation on compatible plate-reader hardware.
Missing evidence: No evidence is provided for pathogen detection use, integration with surveillance assays, analytical performance, or attribution applications.
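The cited API is not specified in the supplied evidence, so the sketch below is a hypothetical illustration of what a programmable plate-reader workflow interface could look like; the class and method names (PlateReaderWorkflow, read_absorbance, stimulate, repeat) are assumptions, not the actual API.

```python
# Hypothetical sketch of a programmable plate-reader workflow API. The
# class and method names are illustrative assumptions, not the cited API.

class PlateReaderWorkflow:
    """Records a sequence of instrument steps for later execution."""

    def __init__(self):
        self.steps = []

    def read_absorbance(self, wells, wavelength_nm):
        self.steps.append(("absorbance", tuple(wells), wavelength_nm))
        return self

    def stimulate(self, wells, wavelength_nm, duration_s):
        # In situ optogenetic stimulation step (illustrative).
        self.steps.append(("stimulate", tuple(wells), wavelength_nm, duration_s))
        return self

    def repeat(self, cycles):
        self.steps = self.steps * cycles
        return self

# Example: alternate blue-light stimulation with OD600 reads over 3 cycles.
wf = (PlateReaderWorkflow()
      .stimulate(["A1", "A2"], wavelength_nm=470, duration_s=60)
      .read_absorbance(["A1", "A2"], wavelength_nm=600)
      .repeat(3))
```

Recording steps as data before execution is what makes such workflows reproducible and schedulable, which is the operational value claimed for the tool.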
Fundamental Biomolecular Actors in Cells Remain Largely Invisible
Cellular and Molecular Biology
0 capabilities · 3 candidate tools
Many of the fundamental actors in cells—proteins, lipids, and metabolites—are still mostly invisible to us, especially when considering their extensive multiplexity, diversity, cell-to-cell heterogeneity, and temporal variation. Without scalable, cost-effective technologies to capture these molecular details, our comprehensive analysis of complex biological systems remains limited.
Candidate tools
Microfluidic single-cell analysis is an assay method used during microfluidic cultivation to quantify growth behavior and expression phenotypes at single-cell resolution. In the cited 2016 E. coli study, it was applied comparatively across PT7lac/LacI, PBAD/AraC, and Pm/XylS expression systems to reveal dynamic and spatiotemporal heterogeneity in recombinant protein production.
This assay directly addresses the gap's emphasis on cell-to-cell heterogeneity and temporal variation by measuring single-cell phenotypes with spatiotemporal resolution. While the supplied evidence is focused on recombinant protein expression in E. coli rather than broad biomolecule profiling, it is still a plausible tool pattern for making otherwise hidden cellular variation more observable.
Assumptions: Assumes single-cell phenotypic readouts are relevant proxies for part of the invisibility problem.
Missing evidence: No evidence here for direct measurement of proteins, lipids, or metabolites beyond expression phenotypes; no scalability or cost-effectiveness data.
Single cell FRET measurements with Rho GTPase biosensors are a quantitative cell-based assay used in primary human endothelial cells to monitor guanine nucleotide exchange factor activity toward Cdc42 and Rac1. In the cited study, the method was applied to compare the cellular activities of overexpressed endothelial GEFs.
This is a concrete single-cell biosensor assay that makes otherwise hidden signaling activity visible in living cells. It fits the gap's need for temporal and cell-to-cell resolution, though the supplied evidence is narrow to Rho GTPase activity rather than general multiplexed detection of proteins, lipids, or metabolites.
Assumptions: Assumes targeted live-cell activity sensing is a useful partial solution to molecular invisibility.
Missing evidence: No evidence for multiplexing, metabolite or lipid readouts, or scalable low-cost deployment.
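The distance sensitivity that makes such biosensors work comes from the Förster relation: transfer efficiency falls off with the sixth power of donor-acceptor separation, so only close molecular proximity produces signal. The Förster radius R0 below is illustrative, not from the cited study.

```python
# Förster relation behind FRET biosensors: transfer efficiency falls
# off with the sixth power of donor-acceptor distance, so only close
# proximity gives signal. The Förster radius R0 is illustrative.

def fret_efficiency(r_nm, r0_nm=5.0):
    """E = 1 / (1 + (r / R0)^6)."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)

close = fret_efficiency(3.0)   # proximal pair: efficient transfer
far = fret_efficiency(8.0)     # distal pair: little transfer
```

The steep sixth-power falloff is why a conformational change of a few nanometers in a Rho GTPase biosensor translates into a large, quantifiable change in FRET signal.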
Single cell-based analysis is a quantitative cellular assay framework developed to compare the activities of overexpressed full-length guanine nucleotide exchange factors in primary human endothelial cells. It was applied with single-cell FRET Rho GTPase biosensors to measure GEF-driven activation of Cdc42 and Rac1.
This framework is specifically built for quantitative single-cell activity measurement and therefore aligns with the gap's concern about heterogeneity across cells. It could help reveal hidden functional variation in selected molecular pathways, but the evidence only supports a focused FRET-based application in endothelial cells.
Assumptions: Assumes a single-cell assay framework can generalize beyond the specific GEF-Rho GTPase use case.
Missing evidence: No direct evidence for broad biomolecule coverage, multiplexity, metabolite or lipid detection, or high-throughput scalability.
Limited Ability to Image Molecules in Their Native Contexts
Cellular and Molecular Biology
0 capabilities · 3 candidate tools
We are currently limited in our ability to image molecules in their native contexts—for example, within live 3D tissues. Achieving scalable, high-resolution imaging of biomolecules in situ would de-risk many areas of biomedical science by enabling integrative, comprehensive molecular mapping within intact specimens.
Candidate tools
NIR light-based imaging is an optical assay and photoregulation approach that uses near-infrared light to sense, and in some cases modulate, specific cellular events in living systems. The cited review describes these strategies as enabling real-time interrogation of deep tissues with subcellular accuracy.
This item is directly an imaging approach, and the supplied summary specifically says it enables real-time interrogation of deep tissues with subcellular accuracy. That is a plausible match to the gap's emphasis on imaging biomolecules in native contexts such as live 3D tissues.
Assumptions: Assumes the cited NIR strategies include molecularly specific readouts rather than only generic tissue imaging.
Missing evidence: No specific reporter chemistry, specimen type, demonstrated molecular targets, or scalability details are provided.
Lattice lightsheet microscopy (LLSM) is a modified light-sheet imaging platform used for three-dimensional optogenetic activation with subcellular resolution. In the cited 2022 study, it enabled high-spatiotemporal-resolution manipulation of cellular behavior, including membrane ruffling and guided cell migration.
LLSM is a concrete 3D light-sheet imaging platform with high spatiotemporal resolution, so it plausibly helps with imaging within intact native-like volumes. Its optical sectioning and 3D operation are relevant to the gap's live tissue context.
Assumptions: Assumes the platform can be used for imaging readout in addition to the optogenetic activation use described.
Missing evidence: The supplied evidence emphasizes photoactivation rather than direct biomolecule imaging, and does not specify molecular labeling strategy, tissue depth performance, or scalability.
Live-cell imaging is an assay method used in neurons in culture and brain slices to observe dynamic cellular processes in real time. The cited studies applied it to visualize minute-scale membrane PI(3,4,5)P3 fluctuations and microtubule retrograde flow during neuronal polarization-related dynamics.
This is a directly relevant assay class for observing dynamic molecular processes in real time. It could support initial efforts toward native-context imaging, although the provided evidence is limited to cultured neurons and brain slices rather than intact 3D tissues.
Assumptions: Assumes the live-cell imaging setup can be paired with suitable molecular reporters.
Missing evidence: No modality, resolution, depth, multiplexing, or intact-tissue performance is specified.
Difficulty Delivering Physical Probes for Imaging into Living Cells
Biophysics
0 capabilities · 3 candidate tools
Delivering physical probes for imaging into living cells is challenging due to barriers in cell membranes and potential perturbation of cellular function. New approaches are required to enable high-dimensional biosensing without invasive probes.
Candidate tools
Photocontrollable nucleic acid displacement reaction is a light-gated nucleic acid engineering method used within a near-infrared-activatable cascade recycling amplification system. In the cited implementation, it is integrated with exonuclease III-assisted nucleic acid amplification and upconversion nanoparticles to enable spatiotemporally controllable, signal-amplified mRNA imaging in selected living cancer cells.
This item is directly described as enabling mRNA imaging in living cells using an externally integrated nucleic-acid circuit with light-gated activation and nanoparticle support. That makes it a plausible route to intracellular biosensing without relying on invasive physical imaging probes.
Assumptions: Treating nucleic-acid imaging circuits plus nanoparticle integration as a substitute strategy for hard-to-deliver physical probes.
Missing evidence: The summary does not specify how the probe system itself enters cells, delivery efficiency across cell types, or perturbation burden.
Exonuclease III assisted nucleic acid cascade recycling amplification is an engineered nucleic acid signal amplification method used within a near-infrared light-activatable circuit. In the cited implementation, it is combined with a photocontrollable nucleic acid displacement reaction and upconversion nanoparticles to enable spatiotemporally controllable amplified mRNA imaging in living cancer cells.
The supplied evidence says this amplification circuit was used for mRNA imaging in living cancer cells, which is directly relevant to intracellular imaging without conventional invasive probes. Signal amplification could also help reduce the need for large or highly perturbative probe payloads.
Assumptions: Assumes amplified intracellular nucleic-acid sensing is relevant to the gap's high-dimensional biosensing goal.
Missing evidence: No explicit evidence is provided on cellular entry method, generalizability beyond the cited implementation, or effects on cell physiology.
Up-conversion phosphors are material-based light-delivery harnesses used to enable remote optogenetic control of neuronal activity in living animals. They are being explored as wireless, less invasive approaches for controlling cellular functions in the brain and other tissues.
Up-conversion phosphors are an externally supplied materials-based delivery harness intended to provide less invasive remote optical access in living tissues. They could plausibly support imaging-related probe activation or readout strategies that avoid direct physical probe insertion.
Assumptions: Assumes remote light-delivery materials may help address the need for less invasive intracellular imaging strategies.
Missing evidence: The evidence is for optogenetic control, not direct delivery of imaging probes into cells, and does not show intracellular imaging use.
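The photophysics that makes upconversion phosphors useful is energy pooling: two absorbed near-infrared photons can yield one higher-energy, shorter-wavelength photon. A lossless-limit sketch (real phosphors emit at longer wavelengths than this ideal, and the 980 nm excitation below is an illustrative choice):

```python
# Energy bookkeeping behind photon upconversion: two absorbed NIR
# photons can pool into one higher-energy (shorter-wavelength) photon.
# Lossless ideal shown; real phosphors emit at longer wavelengths.

def ideal_upconverted_wavelength_nm(lambda1_nm, lambda2_nm):
    # Photon energies add, so inverse wavelengths add: 1/L_out = 1/L1 + 1/L2
    return 1.0 / (1.0 / lambda1_nm + 1.0 / lambda2_nm)

out_nm = ideal_upconverted_wavelength_nm(980.0, 980.0)  # two NIR photons
```

Tissue is relatively transparent to NIR excitation, so converting it locally to visible light is what enables wireless, less invasive optical access at depth.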
Lack of Applied Synthetic Biology Platforms
Synthetic Biology
0 capabilities · 3 candidate tools
Applied synthetic biology is underutilized in applications such as building sustainable food systems and repairing the environmental damage caused by conventional agriculture and industry. Despite advances in tools and chassis engineering, there are few robust platforms that translate synthetic biology into scalable, field-ready solutions. This includes not only the production of low-impact proteins and agricultural inputs but also bioremediation technologies for legacy pollutants—such as pesticide-laden soils, heavy metals, and nutrient runoff—that degrade ecosystems and constrain land use. A new generation of synthetic biology platforms is needed to address both sides of the problem: replacing harmful production methods and cleaning up their long-term consequences.
Candidate tools
Dynamic regulation is a metabolic engineering method that modulates gene expression over time to rebalance metabolic fluxes in response to changing cellular or fermentation conditions. It is used to build responsive cell factories rather than relying on fixed static control.
The gap emphasizes robust, scalable applied platforms, and dynamic regulation is directly aimed at building responsive cell factories rather than static constructs. That could plausibly improve production stability or environmental responsiveness in sustainable biomanufacturing and remediation settings.
Assumptions: Assumes platform bottlenecks include instability or poor control of engineered metabolism over time.
Missing evidence: No direct evidence here for agricultural, environmental, field-deployed, or specific bioremediation use cases.
Model-informed rational design is an engineering method in synthetic biology that uses models to guide the design of biological systems. In the cited plant context, it has been successfully applied to engineering plant gene regulation and metabolism.
A shortage of applied platforms can stem from weak design-to-function predictability, and model-informed rational design is an actionable method for guiding system design. The supplied evidence is in plants, which is at least directionally relevant to agricultural synthetic biology.
Assumptions: Assumes the gap includes a need for better design workflows for plant or agriculture-adjacent systems.
Missing evidence: No direct evidence for scalable deployment, bioremediation, sustainable protein production, or field performance.
Computational protein design is an engineering methodology described in a 2018 review as a next-generation tool for expanding synthetic biology applications. The supplied evidence frames it as a design approach used alongside phage display and high-throughput binding assays rather than as a single molecular reagent.
The item is explicitly framed as a next-generation tool for expanding synthetic biology applications, so it could plausibly support creation of new functional proteins or pathways for applied platforms. It is most relevant as an upstream engineering method rather than a field-ready platform itself.
Assumptions: Assumes new applied platforms may require de novo or optimized protein functions.
Missing evidence: No direct evidence for food systems, environmental cleanup, agricultural inputs, or deployment in production chassis.
Under-Provisioning of Antibiotics, Vaccines and Other Interventions for Major Global Health Challenges
Global Health
0 capabilities · 3 candidate tools
Many of the world’s most deadly diseases—such as tuberculosis, Group A Streptococcus, hepatitis C, hepatitis B, and syphilis—lack effective vaccines or cures. Additionally, the pace of developing effective, low-cost therapeutics for emerging pathogens in low-resource settings is too slow to meet global health needs. Malnutrition exacerbates susceptibility to disease and impedes recovery; food security is especially important for early child development. Understanding the basic science of malnutrition during development is important for designing more effective interventions.
Candidate tools
CRISPR-based biosensors are molecular diagnostic constructs that use CRISPR systems for sequence-specific nucleic acid recognition to detect disease-associated targets. A 2023 review presents them as a strategy for detecting emerging infectious diseases.
The gap explicitly includes slow development of effective interventions for emerging pathogens in low-resource settings, and CRISPR-based biosensors are directly described as molecular diagnostics for detecting emerging infectious diseases. Faster, sequence-specific detection can support outbreak response and intervention deployment even though this item is diagnostic rather than therapeutic.
Assumptions: Assumes diagnostics count as plausible intervention-enabling tools for this global health gap.
Missing evidence: No supplied evidence on low-resource deployment performance, cost per test, field robustness, or specific pathogen coverage.
qRT-PCR is a quantitative reverse-transcription PCR assay used to measure transcript abundance, here applied to GFP mRNA during light-controlled gene expression in Synechococcus sp. PCC 7002. In the cited study, it quantified transcriptional activation and deactivation kinetics of optogenetic systems under green/red and light/dark illumination cycles.
qRT-PCR is an actionable nucleic-acid assay that could be used to measure pathogen or host transcripts during development and evaluation of infectious-disease interventions. It is a relatively standard assay category and could support faster first-pass testing workflows.
Assumptions: Assumes the gap includes enabling assays for intervention development, not only end-product therapeutics or vaccines.
Missing evidence: Supplied evidence is from optogenetic transcript measurement, not infectious disease, malnutrition, vaccine, or antibiotic development.
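qRT-PCR readouts of the kind described above are commonly summarized with the 2^(-ΔΔCt) relative-quantification method, normalizing the target gene to a reference gene and to a control condition. The Ct values below are illustrative, not data from the cited study.

```python
# Relative quantification via the standard 2^(-ddCt) method; all Ct
# values below are illustrative, not data from the cited study.

def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    """Return relative target expression, normalized to a reference
    gene and to the control condition."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# Example: the target amplifies 2 cycles earlier under treatment,
# so dCt drops from 6 to 4 and expression is ~4-fold higher.
fc = fold_change(ct_target_treated=22.0, ct_ref_treated=18.0,
                 ct_target_control=24.0, ct_ref_control=18.0)
```

The method assumes near-100% amplification efficiency for both target and reference; efficiency-corrected variants are used when that assumption fails.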
Spatial-temporal control of bioactive drug release is an engineering method for delivering developmental cytokines, growth factors, and other bioactive factors in defined spatial and temporal patterns. In the cited 2019 review, it is described as a well-developed approach that makes biomimetic release strategies more feasible for tooth regeneration.
Controlled-release delivery strategies can in principle improve how bioactive factors are administered, which is relevant to intervention design. However, the supplied evidence is specifically about tooth regeneration and does not directly address vaccines, antibiotics, infectious disease therapeutics, or malnutrition.
Assumptions: Assumes a general drug-delivery method may be considered if evidence is clearly actionable.
Missing evidence: No supplied evidence for infectious disease use, low-cost implementation, low-resource suitability, or relevance to the named global health conditions.
Most Brain Circuitry is Still Invisible
Neuroscience
0 capabilities · 3 candidate tools
Understanding the complete wiring of the brain at single-cell resolution, along with detailed molecular annotations, is critical for revealing how neural circuits support learning, memory, and behavior. Current technologies are prohibitively expensive and lack scalability, limiting our ability to link molecular composition with circuit connectivity and to understand the alterations present in brain disorders. This gap fundamentally makes diagnosis, treatment, and prevention of many brain disorders more difficult. Beyond the biomedical applications, maps of brain circuitry could play a fundamental role in grounding principles of safety for brain-like AI systems.
Initiatives like the NIH BRAIN Initiative’s transformative projects (the BRAIN Initiative Cell Atlas Network (BICAN), the BRAIN Initiative Connectivity Across Scales (BRAIN CONNECTS) Network, and the Armamentarium for Precision Brain Cell Access) represent important efforts to illuminate foundational principles governing the circuit basis of behavior and to inform new approaches to treating human brain disorders by radically enhancing our understanding of brain cell types and the tools needed to access them (The BRAIN Initiative® 2.0: From Cells to Circuits, Toward Cures).
Candidate tools
HiRet is a lentiviral system for highly efficient retrograde gene transfer that targets specific neural circuits. It supports neural circuit-selective, stable transgene expression and has been used with optogenetic tools to manipulate neuronal activity and behavior.
This item directly targets neural circuits via retrograde gene transfer, which is relevant to making specific connectivity patterns experimentally accessible. It could support circuit-selective labeling or perturbation strategies needed to relate cell identity to wiring, although the supplied evidence emphasizes manipulation more than mapping.
Assumptions: Assumes retrograde transgene delivery could be used for circuit labeling as well as manipulation.
Missing evidence: No supplied evidence on molecular annotation, single-cell resolution, scalability, or demonstrated connectomics use.
Head-mounted fluorescence microscopes are light-based imaging tools used in systems neuroscience to image neurons, neurocircuits, and their inputs and outputs. The supplied evidence places them within the recent expansion of functional imaging approaches but does not specify a particular instrument architecture or readout.
The supplied summary explicitly places this tool in systems neuroscience for imaging neurons, neurocircuits, and their inputs and outputs. That makes it plausibly useful for rendering some circuit activity visible in vivo, even though it does not by itself solve complete wiring or molecular annotation.
Assumptions: Assumes functional imaging of circuits is a partial step toward the stated visibility gap.
Missing evidence: No supplied evidence for structural connectivity mapping, single-cell comprehensive wiring reconstruction, molecular profiling integration, or scalability/cost advantages.
AAV-based viral vectors are adeno-associated virus delivery systems used to introduce optogenetic transgenes for expression in target cell types. In the cited therapeutic optogenetics context, they are presented as promising for human trials but still limited by barriers to general use.
AAV delivery can enable expression of reporters or other transgenes in target cell types, which is a plausible enabling step for circuit visualization and cell-type access. This is relevant to the gap's need to connect molecular identity with circuitry, but the provided evidence is limited to optogenetic transgene delivery rather than mapping.
Assumptions: Assumes target-cell transgene delivery could be repurposed for labeling or recording constructs.
Missing evidence: No supplied evidence for connectome mapping performance, retrograde/transsynaptic tracing, single-cell resolution, or integration with molecular annotation workflows.
We Don’t Have Easy Programmable Synthesis of Biopolymers Other Than Nucleic Acids
Chemistry
0 capabilities · 3 candidate tools
While long-chain nucleic acid synthesis is advancing rapidly, the programmable synthesis of other polymers remains underdeveloped, limiting our capacity to design and produce diverse synthetic polymers.
Candidate tools
Genetic code expansion is an engineering method that enables incorporation of non-physiological amino acids into proteins. In the supplied evidence, it was used to design efficient incorporation systems in Bacillus subtilis and to generate a Cas9 variant that became full-length and active in cultured somatic cells only after BOC exposure.
This method directly expands programmable protein synthesis by enabling incorporation of non-physiological amino acids, which is one concrete route beyond standard nucleic-acid programming. It does not solve general biopolymer synthesis, but it plausibly addresses part of the bottleneck for programmable protein polymer composition.
Assumptions: Treating proteins as the relevant non-nucleic-acid biopolymer class for this gap.
Missing evidence: No supplied evidence on sequence-scale programmability, polymer length control, monomer diversity limits, or applicability beyond proteins.
Dynamic regulation is a metabolic engineering method that modulates gene expression over time to rebalance metabolic fluxes in response to changing cellular or fermentation conditions. It is used to build responsive cell factories rather than relying on fixed static control.
Dynamic regulation could plausibly help if the synthesis bottleneck is metabolic flux control in engineered production of non-nucleic-acid polymers. It is more of an enabling cell-factory strategy than a direct programmable polymer-synthesis method.
Assumptions: Assumes the gap includes biologically produced polymers where pathway balancing is limiting.
Missing evidence: No supplied evidence linking this method to programmable synthesis of specific non-nucleic-acid biopolymers or to monomer-by-monomer control.
Computational protein design is an engineering methodology described in a 2018 review as a next-generation tool for expanding synthetic biology applications. The supplied evidence frames it as a design approach used alongside phage display and high-throughput binding assays rather than as a single molecular reagent.
Computational protein design may support design of protein polymers or enzymes for making new polymers, which is directionally relevant to programmable non-nucleic-acid synthesis. However, the supplied evidence describes it only as a broad design methodology, not a direct synthesis platform.
Assumptions: Assumes designed proteins or enzymes could be part of the solution space for programmable polymer synthesis.
Missing evidence: No supplied mechanism or case showing direct programmable synthesis of non-nucleic-acid biopolymers.
When We Put a Molecule in the Human Body, We Can’t Predict What It Will Do
Physiology and Medicine
0 capabilities · 3 candidate tools
Drug development is often hampered by failures related to absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox). Improved predictive models for molecular interactions are essential for designing safer, more effective drugs, as well as evaluating the impact of environmental chemicals.
Additionally, there is a significant gap in our knowledge of what exactly is present in foods and how these components affect human biology. A comprehensive mapping of the “foodome” and studies on food component functionality are needed to advance nutrition science and personalized dietary interventions.
Candidate tools
TR-FRET assay is a time-resolved fluorescence resonance energy transfer binding assay used in the cited study to confirm binding of a small-molecule ligand to CIB1. In this evidence set, it functions as a chemical-input assay for ligand-binding confirmation.
This assay directly supports one part of the ADME/Tox prediction problem by experimentally confirming small-molecule binding to a target protein. It is a plausible first-pass assay for reducing uncertainty about molecular interactions, which the gap explicitly identifies as a bottleneck.
Assumptions: Assumes target-specific binding assays are useful as one component of broader predictive workflows.
Missing evidence: No evidence here on ADME, metabolism, toxicity, in vivo prediction, food-component profiling, or assay scalability beyond the cited binding-confirmation use case.
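To make the binding-confirmation role concrete, the sketch below computes target occupancy from a standard one-site saturation binding model, the kind of curve a TR-FRET dose-response readout is typically fit against. The Kd and ligand concentrations are illustrative assumptions, not values from the cited CIB1 study.

```python
def fraction_bound(ligand_nM, kd_nM):
    """Equilibrium fraction of target occupied under a one-site binding model."""
    return ligand_nM / (kd_nM + ligand_nM)

# Hypothetical dissociation constant (nM); chosen for illustration only.
kd = 50.0
occupancy = {conc: fraction_bound(conc, kd) for conc in (5.0, 50.0, 500.0)}
```

At ligand concentration equal to Kd the target is half occupied, which is the qualitative signature such assays use to confirm genuine binding rather than background signal.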
Mathematical modeling is a computational method used to guide the rational design of synthetic gene circuits. The cited literature also places it alongside live-cell imaging and within quantitative model systems used to study microbial drug resistance and spatial-temporal features of cancer in mammalian cells.
The gap calls for improved predictive models, and this item is explicitly a computational modeling method. It could plausibly contribute to rational prediction workflows, although the supplied evidence is from synthetic circuits, microbial drug resistance, and cancer model systems rather than ADME/Tox or foodome applications.
Assumptions: Assumes general computational modeling approaches can be adapted to molecular-behavior prediction problems.
Missing evidence: No direct evidence for pharmacokinetics, toxicology, environmental chemicals, nutrition, or food-component interaction modeling.
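As a minimal sketch of the kind of quantitative gene-circuit model the cited literature describes, the code below integrates a self-repressing gene with forward Euler and reports its steady-state protein level. All parameter values are illustrative assumptions, not taken from any cited study.

```python
def simulate_autorepressor(alpha=10.0, K=1.0, n=2, delta=1.0,
                           x0=0.0, dt=0.01, steps=5000):
    """Integrate dx/dt = alpha / (1 + (x/K)**n) - delta*x with forward Euler."""
    x = x0
    for _ in range(steps):
        dxdt = alpha / (1.0 + (x / K) ** n) - delta * x
        x += dt * dxdt
    return x  # approximate steady-state protein level

steady = simulate_autorepressor()
```

With these parameters the fixed point satisfies x + x**3 = 10, i.e. x = 2, so the simulation can be checked against the algebra; that check-against-prediction step is what makes such models useful for rational design.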
FRASE-bot is an in silico fragment-based hit-finding method for drug discovery against unconventional therapeutic targets. It mines thousands of 3D protein-ligand complex structures to build a fragment-in-structural-environment database, matches target protein environments to that database, and uses machine learning to prioritize seeded fragments as candidate binders.
FRASE-bot is a concrete in silico method for predicting candidate protein-ligand interactions, which is relevant to the gap's need for better predictive models of molecular interactions. It may help at the target-binding stage of drug discovery, but the provided evidence does not extend to ADME, toxicity, or food-component effects in humans.
Assumptions: Assumes early binding prediction is a useful subproblem within the broader in vivo predictability gap.
Missing evidence: No direct evidence for whole-body behavior, pharmacokinetics, toxicity prediction, environmental chemical assessment, or foodome mapping.
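The FRASE-style workflow above (database of fragments in structural environments, environment matching, ranked candidates) can be caricatured as follows. The feature vectors, database entries, and similarity scoring are invented for illustration and bear no relation to the real FRASE-bot implementation or its machine-learning stage.

```python
import math

def cosine(a, b):
    """Cosine similarity between two environment feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical database: (fragment name, environment feature vector).
DATABASE = [
    ("frag_A", [0.9, 0.1, 0.3]),
    ("frag_B", [0.2, 0.8, 0.5]),
    ("frag_C", [0.4, 0.4, 0.9]),
]

def rank_fragments(target_env, database=DATABASE):
    """Rank database fragments by similarity of their native environment
    to the target protein's local environment."""
    scored = [(name, cosine(target_env, env)) for name, env in database]
    return sorted(scored, key=lambda t: t[1], reverse=True)

ranking = rank_fragments([0.85, 0.15, 0.25])
```

The real method replaces the toy vectors with learned structural descriptors and a neural network scorer, but the match-then-prioritize shape of the pipeline is the same.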
Designing Manufacturing Systems is Hard
Mechanical Engineering
0 capabilities · 3 candidate tools
Modern manufacturing system design remains complex, with traditional methods relying on outdated processes. AI-based design approaches have the potential to reimagine these systems without inheriting the legacy assumptions of humanoid robotics.
Candidate tools
GUBS (Genomic Unified Behavior Specification) is a domain-specific, rule-based declarative language for behavioral specification of synthetic biological devices. It represents device programs as behavioral specifications for open systems rather than as complete closed-system descriptions.
The gap is about the difficulty of designing complex systems, and GUBS is explicitly a rule-based declarative specification language for system behavior. That makes it a plausible design formalism for expressing and compiling manufacturing-system behaviors, even though the supplied evidence is from synthetic biology rather than manufacturing.
Assumptions: Assumes a behavioral specification language can transfer from biological device design to manufacturing-system design workflows.
Missing evidence: No supplied evidence shows use in mechanical engineering, manufacturing systems, AI-based design, robotics-free system design, or industrial deployment.
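To illustrate the declarative style a language like GUBS embodies (condition-action rules over the state of an open system, rather than a closed-system program), here is a toy rule engine. The rule syntax, state model, and pump example are invented for illustration and are not GUBS.

```python
# Each rule: (name, condition on state, action mutating state).
RULES = [
    ("start_pump", lambda s: s["level"] < 10, lambda s: s.update(pump="on")),
    ("stop_pump",  lambda s: s["level"] > 90, lambda s: s.update(pump="off")),
]

def step(state, rules=RULES):
    """Apply every rule whose condition holds; return the names of fired rules."""
    fired = []
    for name, cond, action in rules:
        if cond(state):
            action(state)
            fired.append(name)
    return fired

state = {"level": 5, "pump": "off"}
fired = step(state)
```

The appeal for design work is that behavior is specified as independent rules that compose, rather than as one monolithic control procedure.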
Feedback linearisation control is an engineering control method designed to regulate an in-wheel motor drive system. In the cited study, it is implemented within a DSP-based control architecture for a light electric vehicle drive using a six-phase permanent magnet synchronous motor.
Manufacturing systems often include tightly controlled electromechanical subsystems, and feedback linearisation control is an actionable control-design method with direct mechanical-engineering relevance. It could help at the subsystem control layer, but it does not directly solve the broader system-design bottleneck described in the gap.
Assumptions: Assumes the manufacturing-system challenge includes control architecture for nonlinear machinery or drive systems.
Missing evidence: No supplied evidence connects this method to manufacturing-system layout/design, AI-based design, factory-level optimization, or non-humanoid automation.
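The core idea of feedback linearisation is to cancel a plant's nonlinearity so a simple linear control law governs the closed loop. The sketch below applies it to a scalar toy plant dx/dt = -x**3 + u; the plant and gains are invented for illustration and are not the six-phase motor model from the cited study.

```python
def simulate(x0=0.0, x_ref=1.0, k=5.0, dt=0.001, steps=5000):
    """Drive the nonlinear plant dx/dt = -x**3 + u to x_ref via
    feedback linearisation: u = x**3 + v cancels the nonlinearity,
    leaving the linear closed loop dx/dt = v with v = -k*(x - x_ref)."""
    x = x0
    for _ in range(steps):
        v = -k * (x - x_ref)     # linear outer-loop control law
        u = x ** 3 + v           # cancel the plant nonlinearity
        x += dt * (-x ** 3 + u)  # plant dynamics under the control input
    return x

final_x = simulate()
```

After cancellation the tracking error decays exponentially at rate k, so the state settles at the reference regardless of the cubic plant dynamics.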
A non-linear disturbance observer (NDO) is an engineering control method used within a feedback linearisation framework to estimate lumped uncertainty. In the cited in-wheel motor drive study, it was implemented as part of a DSP-based intelligent control system for a six-phase permanent magnet synchronous motor.
A non-linear disturbance observer is a concrete control method for handling uncertainty in electromechanical systems, which can matter in manufacturing equipment design. It is only a partial fit because the gap emphasizes overall manufacturing-system design complexity rather than disturbance estimation in a specific motor-control context.
Assumptions: Assumes the target manufacturing systems include nonlinear machine dynamics where uncertainty estimation is a bottleneck.
Missing evidence: No supplied evidence for use in manufacturing-system design, AI-driven design workflows, plant-scale coordination, or broader production-system architecture.
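The observer idea can be shown on the simplest possible plant. For dx/dt = u + d with an unknown constant disturbance d, defining the estimate dhat = z + l*x with observer state dz/dt = -l*(u + dhat) yields d(dhat)/dt = l*(d - dhat), so the estimate converges to the true disturbance. All values are illustrative, not from the cited motor study.

```python
def estimate_disturbance(d_true=0.7, u=0.0, l=10.0, dt=0.001, steps=2000):
    """Run plant and disturbance observer side by side; the observer never
    reads d_true directly, only the measured state x."""
    x, z = 0.0, 0.0
    for _ in range(steps):
        dhat = z + l * x
        z += dt * (-l * (u + dhat))  # observer state update
        x += dt * (u + d_true)       # true plant; disturbance unknown to observer
    return z + l * x                 # final disturbance estimate

dhat = estimate_disturbance()
```

The estimate error contracts by a factor (1 - l*dt) per step, so after two thousand steps the observer has effectively recovered the hidden disturbance from measurements alone.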
Current “Model Systems” for Brain Function are Not Representative of the Real Human Brain
Neuroscience
0 capabilities · 3 candidate tools
Current in vivo and in vitro models often fail to capture human brain function. Innovative model systems—including digital reconstructions, embodied simulations, and new biological models—are needed.
Candidate tools
Mathematical and statistical modelling is a computational design approach used in synthetic biology to improve the predictability of engineered biological systems. In the cited plant synthetic biology literature, it supports model-informed rational design for engineering plant gene regulation and metabolism.
The gap explicitly calls for digital reconstructions and simulations, and this item is a computational modelling method that could support building or evaluating such models. It is not brain-specific in the supplied evidence, so its usefulness is mainly as a general modelling framework rather than a validated neuroscience solution.
Assumptions: Assumes computational modelling is in scope for digital brain model development.
Missing evidence: No evidence here for use in neuroscience, human brain systems, embodied simulation, or specific model fidelity metrics.
Model-informed rational design is an engineering method in synthetic biology that uses models to guide the design of biological systems. In the cited plant context, it has been successfully applied to engineering plant gene regulation and metabolism.
If new biological brain models need to be engineered iteratively, model-informed design could help structure that process. However, the supplied evidence is from plant synthetic biology, so the match to human brain model systems is weak and indirect.
Assumptions: Assumes the gap includes engineering new biological model systems, not only observing existing ones.
Missing evidence: No direct evidence for neural systems, organoids, brain tissue models, or human-relevant validation.
High-resolution live imaging is cited as a methodological approach that provides new opportunities to study branching morphogenesis in living systems. The supplied evidence identifies it only at the level of a general live-imaging assay and does not specify the imaging modality, reporter strategy, or biological model.
Live imaging could help characterize dynamic behavior in candidate brain model systems once they exist. But the evidence only describes a very general imaging approach in another biological context, so it does not directly solve the representativeness problem.
Assumptions: Assumes phenotyping and validation of new models is part of addressing the gap.
Missing evidence: No specified imaging modality, neural application, human brain model context, or evidence that it improves model representativeness.
Modeling Mechanical Systems is Hard
Mechanical Engineering
0 capabilities · 3 candidate tools
The simulation and modeling of complex mechanical systems is challenging due to the intricate interplay of multiple physical phenomena. Improved computational models can enhance design and optimization.
Candidate tools
Mathematical modeling is a computational method used to guide the rational design of synthetic gene circuits. The cited literature also places it alongside live-cell imaging and within quantitative model systems used to study microbial drug resistance and spatial-temporal features of cancer in mammalian cells.
This is the only candidate explicitly framed as a general computational modeling method, which is directionally aligned with a gap about improving simulation and modeling. It could plausibly support formulation of quantitative models for complex systems, but the supplied evidence is from synthetic biology rather than mechanical engineering.
Assumptions: Assumes a general modeling method can transfer across domains at a conceptual level.
Missing evidence: No evidence is provided for use in mechanical systems, multiphysics simulation, design optimization of engineered hardware, or benchmark performance in mechanical engineering contexts.
Molecular dynamics simulation is a computational method for modeling atomistic conformational dynamics of proteins and analyzing residue fluctuations and vibrational behavior. In the cited studies, it was used as a noninvasive approach to validate dynamic behavior and to compare PAS-domain dynamics across functional groups.
Molecular dynamics simulation is a concrete computational simulation method for complex physical behavior, so it has a weak but plausible methodological connection to hard modeling problems. However, the evidence is limited to atomistic protein dynamics, not mechanical engineering systems.
Assumptions: Assumes the gap can include borrowing simulation ideas from other physical modeling domains.
Missing evidence: No evidence for rigid-body, continuum, finite-element, multibody, or multiphysics mechanical system modeling.
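The numerical core of a molecular dynamics code is a symplectic integrator stepping positions and velocities under computed forces. The sketch below runs velocity Verlet on a one-dimensional harmonic "bond" instead of a real protein force field; the spring constant and timestep are illustrative assumptions.

```python
def force(x, k=1.0):
    """Harmonic restoring force, standing in for a real force field."""
    return -k * x

def velocity_verlet(x=1.0, v=0.0, m=1.0, dt=0.01, steps=1000):
    """Standard velocity-Verlet update: half-kick, drift, recompute force, half-kick."""
    f = force(x)
    for _ in range(steps):
        v += 0.5 * dt * f / m
        x += dt * v
        f = force(x)
        v += 0.5 * dt * f / m
    return x, v

x_final, v_final = velocity_verlet()
energy = 0.5 * v_final ** 2 + 0.5 * x_final ** 2
```

A useful sanity check, and the reason this integrator is standard in MD, is that total energy stays close to its initial value (here 0.5) over long runs rather than drifting.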
Transition path sampling is a computational method applied to explicit-solvent molecular dynamics trajectories to extract atomistic features of conformational reaction networks. In the cited study, it was used to analyze the millisecond partial unfolding transition in the light-driven photocycle of photoactive yellow protein and to predict reaction coordinate models and tentative transition states.
This is a specialized computational method for extracting reaction-coordinate information from complex simulated trajectories, which loosely relates to analyzing high-dimensional dynamical systems. The supplied evidence is highly specific to protein conformational transitions, so applicability to mechanical engineering is very uncertain.
Assumptions: Assumes some dynamical-systems analysis concepts may be reusable outside the original domain.
Missing evidence: No evidence for application to mechanical systems, engineering design workflows, or coupled physical phenomena in macroscopic simulations.
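The sampling scheme itself can be shown in miniature. The sketch below runs one-way-shooting transition path sampling on a one-dimensional double well V(x) = (x**2 - 1)**2 with overdamped Langevin dynamics, keeping only trajectories that connect basin A (x < -0.8) to basin B (x > 0.8). Everything here is a didactic toy, far simpler than the explicit-solvent protein study in the evidence.

```python
import random

random.seed(0)
DT, NOISE, LENGTH = 0.01, 1.0, 400

def drift(x):
    return -4.0 * x * (x * x - 1.0)  # -dV/dx for V(x) = (x**2 - 1)**2

def propagate(x, steps):
    """Overdamped Langevin trajectory of the given number of steps."""
    path = [x]
    for _ in range(steps):
        x += DT * drift(x) + NOISE * (DT ** 0.5) * random.gauss(0.0, 1.0)
        path.append(x)
    return path

def in_A(x):
    return x < -0.8

def in_B(x):
    return x > 0.8

def initial_path():
    """Brute-force sampling until one trajectory connects basin A to basin B."""
    while True:
        p = propagate(-1.0, LENGTH)
        if in_B(p[-1]):
            return p

def shoot(path):
    """Forward one-way shooting: keep a random prefix, regenerate the rest,
    and accept the trial only if it is still reactive (ends in B)."""
    i = random.randrange(1, len(path) - 1)
    trial = path[:i + 1] + propagate(path[i], len(path) - 1 - i)[1:]
    return trial if in_B(trial[-1]) else path

path = initial_path()
for _ in range(20):
    path = shoot(path)
```

Repeating the shooting move harvests an ensemble of reactive paths, from which reaction-coordinate candidates and tentative transition states can be extracted, which is the role the method plays in the cited study.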
Fraud in the Scientific Literature
Metascience
0 capabilities · 3 candidate tools
Scientific literature is plagued by fraudulent publications, undermining trust and slowing progress.
Candidate tools
Standardisation is an engineering method in synthetic biology in which engineering principles are applied to genetic manipulation workflows. The cited literature states that standardisation, together with key technical advances, enabled major gains in the speed and accuracy of genetic manipulation.
Standardisation could plausibly help reduce opportunities for irreproducible or opaque experimental workflows by making methods more uniform and auditable. For a metascience problem centered on trust in the literature, this is one of the few candidate items that directly concerns research process rather than a specific biological intervention.
Assumptions: Assumes the gap includes process-level interventions to improve literature reliability, not only fraud detection.
Missing evidence: No supplied evidence shows direct use for detecting, preventing, or auditing scientific fraud specifically.
High-throughput screening is an assay method cited in microbial biotechnology literature as part of the CRISPR/Cas toolbox for evaluating variants generated by multiplexed engineering. In the supplied evidence, it is presented as a screening approach associated with CRISPR/Cas-based metabolic engineering and with development of new dynamic systems.
High-throughput screening could weakly help if the intended intervention is large-scale empirical re-testing of published biological claims. Its fit is limited because the supplied evidence frames it as a variant-screening assay within engineering workflows, not as a metascience fraud-detection method.
Assumptions: Assumes the gap could be addressed partly through scalable replication testing of experimental claims.
Missing evidence: No evidence here links this assay to literature auditing, fraud detection, or reproducibility assessment across publications.
Mathematical and statistical modelling is a computational design approach used in synthetic biology to improve the predictability of engineered biological systems. In the cited plant synthetic biology literature, it supports model-informed rational design for engineering plant gene regulation and metabolism.
Statistical modelling could plausibly support analyses that flag anomalous patterns or improve assessment of claim reliability at scale. However, the supplied evidence only supports its use for model-informed design in plant synthetic biology, not for metascience or fraud analysis.
Assumptions: Assumes statistical modelling is being considered as a general analytic method for literature-level pattern analysis.
Missing evidence: No direct evidence for application to publication fraud, anomaly detection, or literature integrity.
Inadequate Imaging of Material Structures
Physics
0 capabilities · 3 candidate tools
Many materials’ internal structures are difficult to image with current technologies, limiting our understanding of their properties at the nanoscale.
Candidate tools
Light-sheet microscopy, also termed single plane illumination microscopy, is an in vivo fluorescence imaging method tailored to larval research and embryonic imaging. The supplied evidence indicates that it can capture the full course of embryonic development from egg to larva and has been coupled with optogenetic perturbation to study Wnt signaling during embryogenesis.
This is a concrete imaging method, so it is more directly relevant than most candidates to a gap about inadequate imaging. The supplied evidence supports high-resolution fluorescence imaging, but it is described mainly for embryos and in vivo biology rather than nanoscale internal material structure imaging.
Assumptions: Assumes fluorescence-based optical microscopy could be adapted to some material-structure imaging use cases.
Missing evidence: No evidence here on materials imaging, nanoscale resolution, non-biological samples, or internal structure performance in physics contexts.
Lattice lightsheet microscopy (LLSM) is a modified light-sheet imaging platform used for three-dimensional optogenetic activation with subcellular resolution. In the cited 2022 study, it enabled high-spatiotemporal-resolution manipulation of cellular behavior, including membrane ruffling and guided cell migration.
This is an advanced optical imaging platform with three-dimensional, high-spatiotemporal-resolution capability, which is at least directionally relevant to difficult structure imaging problems. However, the provided evidence is about subcellular optogenetic manipulation in cells, not imaging internal nanoscale structures of materials.
Assumptions: Assumes 3D optical sectioning capability could be informative for some structured material samples.
Missing evidence: No direct evidence for materials science use, nanoscale structural readout, penetration into opaque materials, or achievable structural resolution for material interiors.
Live-cell imaging is an assay method used in neurons in culture and brain slices to observe dynamic cellular processes in real time. The cited studies applied it to visualize minute-scale membrane PI(3,4,5)P3 fluctuations and microtubule retrograde flow during neuronal polarization-related dynamics.
This is a real imaging assay method and could, in a very general sense, support observation of structural dynamics. But the supplied evidence is limited to neuronal cell biology and does not show suitability for internal nanoscale material structure imaging.
Assumptions: Assumes the gap could include dynamic imaging workflows rather than only static materials characterization.
Missing evidence: No evidence for materials applications, nanoscale resolution, internal structure access, or compatibility with nonliving material samples.
Manual and Laborious Nature of Chemical Synthesis
Chemistry
0 capabilities · 3 candidate tools
Chemical synthesis remains largely manual, limiting throughput and reproducibility. The field requires robust automation to accelerate discovery and production of new molecules.
Candidate tools
In silico feedback control strategies are computationally implemented control schemes coupled to optogenetic measurement and light stimulation platforms. They are used to create computer-controlled living systems through automated measurement and stimulation workflows.
This is the only candidate explicitly centered on automation workflows and computer-controlled operation, which is directly relevant to reducing manual intervention. Its feedback-control framing could plausibly support more reproducible, automated synthesis processes, although the supplied evidence is from optogenetic measurement and stimulation rather than chemistry automation.
Assumptions: Assumes the automation and control concepts could be adapted from biological stimulation workflows to chemical synthesis operations.
Missing evidence: No direct evidence that this method has been applied to chemical synthesis, reaction orchestration, liquid handling, or synthesis hardware.
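The measure-compute-stimulate loop at the heart of such systems can be sketched as follows: a simulated cell whose protein level rises under light and decays in the dark is held at a setpoint by a simple on/off controller. All dynamics and numbers are invented for illustration.

```python
def run_loop(setpoint=5.0, production=2.0, decay=0.2, dt=0.1, steps=600):
    """Closed-loop control of a simulated light-inducible gene."""
    protein, light = 0.0, True
    for _ in range(steps):
        measured = protein                 # "measurement" of the current state
        light = measured < setpoint        # in silico control decision
        # simulated cellular response to the stimulation
        protein += dt * ((production if light else 0.0) - decay * protein)
    return protein

final_level = run_loop()
```

The point of closing the loop in software is visible even in this toy: the protein level is regulated near the setpoint despite the open-loop steady state (production/decay = 10) being twice as high.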
FRASE-bot is an in silico fragment-based hit-finding method for drug discovery against unconventional therapeutic targets. It mines thousands of 3D protein-ligand complex structures to build a fragment-in-structural-environment database, matches target protein environments to that database, and uses machine learning to prioritize seeded fragments as candidate binders.
FRASE-bot could reduce some manual work upstream of synthesis by computationally prioritizing fragments and candidate binders, potentially narrowing what needs to be synthesized and tested. That may improve discovery efficiency, but it does not directly automate chemical synthesis itself.
Assumptions: Assumes the gap includes reducing manual design-selection cycles in small-molecule discovery, not only bench synthesis execution.
Missing evidence: No evidence of integration with automated synthesis platforms, reaction planning, or laboratory execution.
FRASE, also described as FRASE-bot, is a computational fragment-based ligand discovery method that mines 3D ligand–protein complex structures to build a database of fragments in structural environments. It screens this database against a target protein, seeds the target structure with relevant ligand fragments, and uses a neural network to prioritize fragments with the highest likelihood of being native binders.
This computational fragment-prioritization method could help reduce manual candidate selection before synthesis, which may modestly improve throughput in molecule discovery workflows. However, the evidence does not show that it automates synthesis steps or improves synthesis reproducibility directly.
Assumptions: Assumes computational triage of compounds is in scope as a partial response to laborious synthesis workflows.
Missing evidence: No direct evidence for synthesis automation, robotic execution, or reaction reproducibility benefits.
Much of the Biosphere Remains Uncharted and Vulnerable to Information Loss
Ecology
0 capabilities · 3 candidate tools
Much of Earth's biosphere—from the deep ocean to atmospheric bioaerosols—remains unexplored, with the microbial majority largely uncharted. Advancing new exploration technologies and systematically cataloging the Earth's microbiome could unlock discoveries of new life forms and biological insights that could impact health, climate, geoengineering, agriculture, and fundamental biology.
Candidate tools
High-throughput screening is an assay method cited in microbial biotechnology literature as part of the CRISPR/Cas toolbox for evaluating variants generated by multiplexed engineering. In the supplied evidence, it is presented as a screening approach associated with CRISPR/Cas-based metabolic engineering and with development of new dynamic systems.
Systematically cataloging large numbers of microbial samples implies a need for scalable assay workflows, and high-throughput screening is one of the few supplied items explicitly aligned to work at scale. It could plausibly support rapid functional triage of many isolates or engineered variants relevant to biosphere exploration, though the evidence is tied to CRISPR/metabolic-engineering contexts rather than environmental discovery directly.
Assumptions: Assumes the gap includes scalable functional characterization as part of cataloging unexplored microbiomes.
Missing evidence: No supplied evidence for use in environmental microbiome discovery, field sampling pipelines, sequencing integration, or uncultured community profiling.
High-resolution live imaging is cited as a methodological approach that provides new opportunities to study branching morphogenesis in living systems. The supplied evidence identifies it only at the level of a general live-imaging assay and does not specify the imaging modality, reporter strategy, or biological model.
Live imaging could help characterize newly discovered organisms or microbial behaviors once samples are obtained, providing direct observational phenotyping. However, the supplied evidence is very general and does not connect this assay to environmental exploration or large-scale cataloging.
Assumptions: Assumes some part of the gap involves downstream phenotypic characterization of newly sampled biology.
Missing evidence: No modality details, no environmental or microbial exploration use case, and no evidence for scalable deployment in biosphere cataloging.
Mathematical modeling is a computational method used to guide the rational design of synthetic gene circuits. The cited literature also places it alongside live-cell imaging and within quantitative model systems used to study microbial drug resistance and spatial-temporal features of cancer in mammalian cells.
Computational modeling could plausibly assist in organizing or interpreting complex biological data generated during exploration efforts. Still, the supplied evidence is centered on synthetic gene circuits and other engineered systems, so its relevance to uncharted biosphere mapping is indirect.
Assumptions: Assumes lightweight computational methods for data interpretation are in scope for the gap.
Missing evidence: No evidence for ecological surveying, microbiome cataloging, biodiversity inference, or environmental sampling analysis.
Challenges in Tracking and Restoring Resilient Ecosystems
Ecology
0 capabilities · 3 candidate tools
Regenerating degraded environments and designing self-sustaining systems require a unified understanding of ecological dynamics. Our current models fall short in predicting complex interactions—such as feedback loops and stability thresholds—that determine ecosystem behavior.
To close this gap, we need better datasets and models of biodiversity and animal movements, as well as tools to predict and contain invasive species. We also need the ability to experiment with restoration strategies, and validate approaches ranging from rewilding to engineering de-extinction technologies.
Candidate tools
Model-informed rational design is an engineering method in synthetic biology that uses models to guide the design of biological systems. In the cited plant context, it has been successfully applied to engineering plant gene regulation and metabolism.
The gap explicitly calls for better predictive models of complex ecological interactions, and this item is a model-guided design method. It could plausibly support structured design and comparison of restoration strategies, although the supplied evidence is from plant synthetic biology rather than ecosystem-scale ecology.
Assumptions: Assumes model-guided design principles can transfer from engineered biological systems to restoration strategy design.
Missing evidence: No direct evidence for ecology, biodiversity modeling, invasive species prediction, or ecosystem restoration use.
Switched differential equations were developed as a computational framework to model oscillatory behavior of circadian clock cells in the Madeira cockroach. The model was used to interpret RNAi perturbation phenotypes and to support a hypothesis of coupled morning and evening oscillators linked by mutual inhibition.
The gap centers on feedback loops, stability thresholds, and ecosystem dynamics, and this item is a dynamical-systems modeling framework for oscillatory behavior with switching and mutual inhibition. It may be adaptable for representing ecological regime shifts or threshold behavior, but the provided evidence is limited to a circadian cell model in cockroach biology.
Assumptions: Assumes switched dynamical-system formalisms may generalize to ecological state-transition modeling.
Missing evidence: No direct evidence for ecosystem, population, biodiversity, movement, or invasive species applications.
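The switched-dynamics idea described above can be sketched in a few lines. The following is a minimal illustrative model, not the published cockroach framework: two relaxation oscillators whose governing equations switch regimes at activity thresholds, with a highly active partner slowing the other unit's rise (mutual inhibition). All rates, thresholds, and the inhibition strength are invented for illustration.

```python
# Minimal illustrative switched-ODE model (not the published cockroach model):
# two relaxation oscillators with mutual inhibition. Each unit obeys simple
# linear dynamics whose sign switches when its activity crosses a threshold,
# and a highly active partner slows the other unit's rise.

def simulate(t_end=40.0, dt=0.001):
    x = [0.0, 0.6]            # activities of the two oscillator units
    mode = [+1, +1]           # +1: rising regime, -1: falling regime
    rise, fall = 1.0, 2.0     # rates in each regime (invented values)
    hi, lo = 1.0, 0.0         # thresholds that trigger regime switches
    inhibit = 0.7             # fractional slowing of the rise by an active partner
    trace = []
    for n in range(int(t_end / dt)):
        for i in (0, 1):
            partner_active = x[1 - i] > 0.5
            if mode[i] > 0:
                x[i] += rise * (1.0 - inhibit * partner_active) * dt
                if x[i] >= hi:
                    mode[i] = -1      # switch to the falling regime
            else:
                x[i] -= fall * dt
                if x[i] <= lo:
                    mode[i] = +1      # switch back to the rising regime
        trace.append((n * dt, x[0], x[1]))
    return trace

trace = simulate()
```

The same switching-plus-coupling structure is what would need to be reparameterized (with many more state variables) to represent ecological regime shifts rather than circadian cells.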
Sequencing-based solutions are proposed assay methods for detecting large-scale CRISPR-associated genomic alterations. In the cited review, they are positioned as potential approaches to identify rare events such as translocations, inversions, deletions, and chromothripsis that can be missed by current workflows.
The gap mentions a need for better datasets, and sequencing-based assays can in principle generate high-resolution biological data. However, the supplied evidence is specifically about detecting rare CRISPR-associated genomic alterations, not biodiversity tracking, animal movement, or ecosystem monitoring.
Assumptions: Assumes only a very general relevance through data generation capability.
Missing evidence: No evidence for environmental sampling, species detection, metagenomics, eDNA, field deployment, or restoration monitoring.
Many Molecules Can’t Easily Be Crystallized
Materials Science
0 capabilities · 3 candidate tools
Crystallization is crucial for determining molecular structure, yet many molecules resist forming crystals. Improved computational models of crystal growth are needed to guide experimental efforts.
Candidate tools
Molecular dynamics simulations were used as a computational design method to guide construction of the PiL[D24] photoswitchable mPKM2-LOV2 fusion reported in the 2017 FEBS Journal study. In that context, the simulations supported engineering of a light-responsive pyruvate kinase chimera that preserved LOV2 photoreactivity and showed illumination-dependent changes in enzyme activity.
The gap explicitly asks for improved computational models to guide experiments when crystallization is difficult, and molecular dynamics is a directly relevant structural modeling approach. It could plausibly provide conformational or interaction hypotheses that reduce dependence on crystals, although the supplied evidence is from a light-responsive protein-engineering context rather than crystal growth.
Assumptions: Assumes a general structural-modeling method can still be useful outside the specific optogenetic protein example.
Missing evidence: No supplied evidence that this item models crystallization, crystal growth, nucleation, or non-biological materials molecules.
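For readers unfamiliar with what a molecular dynamics simulation actually computes, a toy sketch may help: the core loop numerically integrates Newton's equations of motion under a force field. The example below is a deliberately minimal 1D harmonic "bond" integrated with velocity Verlet; it is not the cited study's setup, and all parameters are invented.

```python
# Toy molecular dynamics sketch (not the cited study's system): velocity-Verlet
# integration of one harmonic "bond" between two particles in 1D. Production MD
# codes apply the same integrator to full biomolecular force fields; all
# parameters here are invented.

def velocity_verlet(steps=10000, dt=0.001, k=100.0, r0=1.0, m=1.0):
    x1, x2 = 0.0, 1.2        # bond starts stretched 0.2 past its rest length
    v1, v2 = 0.0, 0.0

    def forces(x1, x2):
        # Harmonic bond: magnitude k * (r - r0), equal and opposite on the pair
        stretch = (x2 - x1) - r0
        return k * stretch, -k * stretch

    f1, f2 = forces(x1, x2)
    for _ in range(steps):
        v1 += 0.5 * dt * f1 / m; v2 += 0.5 * dt * f2 / m   # half kick
        x1 += dt * v1;           x2 += dt * v2             # drift
        f1, f2 = forces(x1, x2)
        v1 += 0.5 * dt * f1 / m; v2 += 0.5 * dt * f2 / m   # half kick
    kinetic = 0.5 * m * (v1 * v1 + v2 * v2)
    potential = 0.5 * k * ((x2 - x1) - r0) ** 2
    return kinetic + potential   # total energy, conserved by the integrator

total_energy = velocity_verlet()
```

Energy conservation over long runs is the standard sanity check that makes velocity Verlet the default integrator in MD packages.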
Transition path sampling is a computational method applied to explicit-solvent molecular dynamics trajectories to extract atomistic features of conformational reaction networks. In the cited study, it was used to analyze the millisecond partial unfolding transition in the light-driven photocycle of photoactive yellow protein and to predict reaction coordinate models and tentative transition states.
Transition path sampling is a computational method for extracting reaction coordinates and transition-state information from molecular dynamics trajectories. That general capability could plausibly inform hard-to-observe structural transitions relevant to crystallization-resistant systems, but the provided evidence is limited to a protein photocycle study.
Assumptions: Assumes transition-state and pathway analysis may transfer to crystallization-related conformational or assembly problems.
Missing evidence: No supplied evidence for use in crystal growth, nucleation, crystallization screening, or materials-science molecules.
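A stripped-down sketch of the transition path sampling idea, far simpler than the cited explicit-solvent protein study: sample reactive trajectories between two metastable states by repeatedly "shooting" new trajectory segments from frames of the current path and keeping only trial paths that still connect the states. The double-well potential, state definitions, and one-way-shooting simplification below are all illustrative assumptions, not the published method.

```python
import math
import random

# Toy transition path sampling (TPS) sketch: overdamped Langevin dynamics on a
# 1D double-well U(x) = (x^2 - 1)^2, with simplified one-way shooting moves.
# Real TPS studies use two-way shooting and detailed-balance acceptance rules
# on high-dimensional molecular systems.

random.seed(0)
DT = 0.001      # integration time step
BETA = 3.0      # inverse temperature (barrier height is 3 kT)

def in_A(x):    # reactant state
    return x < -0.8

def in_B(x):    # product state
    return x > 0.8

def force(x):   # -dU/dx for U(x) = (x^2 - 1)^2
    return -4.0 * x * (x * x - 1.0)

def shoot_forward(x, n):
    """Generate n Langevin steps starting from x (returns n + 1 frames)."""
    path = [x]
    for _ in range(n):
        noise = random.gauss(0.0, math.sqrt(2.0 * DT / BETA))
        path.append(path[-1] + DT * force(path[-1]) + noise)
    return path

def initial_path(length=2000):
    # Brute-force a first reactive path from A; acceptable for a toy model.
    while True:
        p = shoot_forward(-1.0, length)
        if in_B(p[-1]):
            return p

def tps(n_moves=200):
    path = initial_path()
    accepted = 0
    for _ in range(n_moves):
        i = random.randrange(1, len(path) - 1)
        # One-way shooting: keep the path up to frame i, regenerate the tail.
        trial = path[:i + 1] + shoot_forward(path[i], len(path) - 1 - i)[1:]
        if in_A(trial[0]) and in_B(trial[-1]):   # accept only reactive paths
            path, accepted = trial, accepted + 1
    return path, accepted

sampled_path, n_accepted = tps()
```

The ensemble of accepted paths is what downstream analyses (reaction coordinates, transition states) are computed from.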
Likelihood maximization analysis is a computational method for selecting reaction coordinate models for individual substeps of a conformational transition and inferring tentative transition states. In the cited application, it was applied to transition path sampling data from explicit-solvent molecular dynamics of the millisecond partial unfolding transition in the photoactive yellow protein photocycle.
This method selects reaction-coordinate models and infers tentative transition states from simulation data, which is at least directionally aligned with the gap's need for better computational modeling. It may help analyze structural transitions in systems that are hard to crystallize, but the evidence provided is narrow and not about crystallization itself.
Assumptions: Assumes reaction-coordinate inference is relevant to molecular assembly or conformational transitions that affect crystallizability.
Missing evidence: No supplied evidence connecting this method to crystal formation, crystal growth prediction, or materials applications.
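The likelihood-maximization idea can be illustrated with synthetic data, in the spirit of Peters-and-Trout-style reaction-coordinate selection but heavily simplified: fit a tanh committor model to binary shooting outcomes for each candidate coordinate and rank coordinates by the maximized log-likelihood. The data-generating model and the coarse grid-search fit below are invented for illustration.

```python
import math
import random

# Simplified likelihood-maximization sketch for reaction-coordinate selection:
# model the probability that a shot reaches product state B as
# p_B(q) = (1 + tanh(a0 + a1*q)) / 2, fit (a0, a1) by maximizing the
# log-likelihood of observed shooting outcomes, and rank candidate
# coordinates by the maximized likelihood. All data here are synthetic.

random.seed(1)

def log_likelihood(a0, a1, data):
    ll = 0.0
    for q, hit_B in data:
        p = 0.5 * (1.0 + math.tanh(a0 + a1 * q))
        p = min(max(p, 1e-12), 1.0 - 1e-12)   # guard against log(0)
        ll += math.log(p if hit_B else 1.0 - p)
    return ll

def max_log_likelihood(data, grid=41):
    # Coarse grid search over (a0, a1); real implementations use optimizers.
    return max(
        log_likelihood(-4.0 + 8.0 * i / (grid - 1),
                       -8.0 + 16.0 * j / (grid - 1), data)
        for i in range(grid) for j in range(grid)
    )

# Synthetic shooting data: outcomes truly depend on q1; q2 is pure noise.
shots = []
for _ in range(400):
    q1, q2 = random.uniform(-1, 1), random.uniform(-1, 1)
    hit_B = random.random() < 0.5 * (1.0 + math.tanh(3.0 * q1))
    shots.append((q1, q2, hit_B))

score_q1 = max_log_likelihood([(q1, h) for q1, _, h in shots])
score_q2 = max_log_likelihood([(q2, h) for _, q2, h in shots])
# The informative coordinate q1 should achieve the higher likelihood.
```

Ranking candidate coordinates this way is how the method infers which structural variables best describe a transition, which is the step that would need to transfer to crystallization-related problems.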
Lack of Direct Measurement of Quantum Effects in Biological Systems
Biophysics
0 capabilities · 3 candidate tools
Despite theoretical predictions, quantum effects in biological systems remain largely unmeasured. Direct experimental evidence is needed to explore how quantum phenomena influence biomolecular interactions.
Candidate tools
Microfluidic single-cell analysis is an assay method used during microfluidic cultivation to quantify growth behavior and expression phenotypes at single-cell resolution. In the cited 2016 E. coli study, it was applied comparatively across PT7lac/LacI, PBAD/AraC, and Pm/XylS expression systems to reveal dynamic and spatiotemporal heterogeneity in recombinant protein production.
This is at least a concrete measurement method with single-cell and spatiotemporal resolution, which could help detect subtle biological signatures if quantum-linked effects manifest as dynamic heterogeneity. However, the supplied evidence does not show that it measures quantum phenomena directly.
Assumptions: Assumes indirect phenotypic readouts could be useful in an early exploratory biophysics workflow.
Missing evidence: No evidence of quantum-sensitive readout, biomolecular coherence measurement, spectroscopy integration, or use in quantum biology contexts.
High-resolution live imaging is cited as a methodological approach that provides new opportunities to study branching morphogenesis in living systems. The supplied evidence identifies it only at the level of a general live-imaging assay and does not specify the imaging modality, reporter strategy, or biological model.
Direct measurement is the core bottleneck, and live imaging is at least an actionable assay category for observing biological dynamics in real time. But the provided summary is too generic and does not indicate any modality capable of resolving quantum effects.
Assumptions: Assumes the gap may benefit from real-time observational assays even if they are not quantum-specific.
Missing evidence: Imaging modality, reporter design, temporal resolution, sensitivity limits, and any demonstrated relevance to quantum effects are not provided.
Mathematical and statistical modelling is a computational design approach used in synthetic biology to improve the predictability of engineered biological systems. In the cited plant synthetic biology literature, it supports model-informed rational design for engineering plant gene regulation and metabolism.
The gap mentions theoretical predictions needing experimental follow-up, and modelling could help formalize hypotheses and design discriminating experiments. Still, this does not itself provide direct measurement of quantum effects.
Assumptions: Assumes experiment-planning support is still relevant to addressing the measurement bottleneck.
Missing evidence: No evidence of quantum-mechanical modelling, quantum-biology application, or linkage to a specific measurement platform.
Light Scattering in Living Tissue Prevents Optical Access to Deeper Regions
Biophysics
0 capabilities · 2 candidate tools
Living tissue exhibits strong light scattering, which hampers deep-tissue imaging and limits resolution. Overcoming this barrier is critical for mapping neural activity and enabling noninvasive diagnostic imaging.
Candidate tools
NIR light-based imaging is an optical assay and photoregulation approach that uses near-infrared light to sense, and in some cases modulate, specific cellular events in living systems. The cited review describes these strategies as enabling real-time interrogation of deep tissues with subcellular accuracy.
This item directly addresses deep-tissue optical access by using near-infrared light, which is presented in the supplied evidence as enabling real-time interrogation of deep tissues with subcellular accuracy. That is a close mechanistic match to the gap's core bottleneck of scattering-limited imaging depth.
Assumptions: Assumes the gap can be addressed by shifting imaging/excitation wavelengths rather than only by tissue clearing or adaptive optics.
Missing evidence: No structured evidence here on exact penetration depth, resolution tradeoffs, or specific tissue types.
Scintillator-mediated optogenetics is an engineering method in which implanted Ce:GAGG microparticles convert X-ray irradiation into scintillation light that activates red-shifted opsins. In mice, this enabled wireless modulation of neural activity at tissue depth, including bidirectional control of midbrain dopamine neurons and associated place preference behavior.
This method bypasses poor direct optical penetration by converting deeply penetrating X-ray input into local scintillation light inside tissue. The supplied evidence specifically supports wireless modulation of neural activity at tissue depth, which is relevant to the gap's deep neural access context.
Assumptions: Assumes partial relevance of deep-tissue actuation strategies to the broader optical-access problem, even though the gap emphasizes imaging as well as mapping.
Missing evidence: No supplied evidence that this method improves deep-tissue imaging or image resolution; evidence is for actuation in vivo after microparticle implantation.
Live Cell Imaging at Deep Nanoscale Resolution is Destructive
Biophysics
0 capabilities · 2 candidate tools
Techniques that achieve deep nanoscale resolution in live cell imaging often destroy the sample, limiting the ability to conduct longitudinal studies on the same specimen.
Candidate tools
Reversible protein highlighting is a light-based live-cell imaging assay that uses a photochromic fluorescent protein to repeatedly highlight, erase, and re-highlight labeled molecules without destructive readout. It was applied to visualize stimulus-dependent, bidirectional nucleocytoplasmic shuttling of extracellular signal-regulated kinase (ERK) across the nuclear envelope.
This method is explicitly described as allowing repeated highlight/erase/re-highlight cycles in live cells without destructive readout, which directly aligns with the gap's need for longitudinal imaging on the same specimen. Its photochromic switching mechanism could reduce the need for one-time destructive observation, although the evidence does not show deep nanoscale performance.
Assumptions: Assumes non-destructive repeated readout is a key bottleneck even if absolute nanoscale resolution is not demonstrated here.
Missing evidence: No supplied evidence on nanoscale resolution, imaging depth, phototoxicity benchmarks, or longitudinal performance over extended time courses.
Lattice light-sheet microscopy (LLSM) is a modified light-sheet imaging platform used for three-dimensional optogenetic activation with subcellular resolution. In the cited 2022 study, it enabled high-spatiotemporal-resolution manipulation of cellular behavior, including membrane ruffling and guided cell migration.
Light-sheet microscopy is plausibly relevant to reducing imaging damage in live specimens, and this item specifically supports high-spatiotemporal-resolution, subcellular live-cell manipulation/imaging. It is one of the few supplied items that points toward an advanced optical platform rather than a generic imaging assay.
Assumptions: Assumes lattice light-sheet implementation may be less destructive than other high-resolution live imaging approaches.
Missing evidence: The supplied summary does not explicitly state reduced phototoxicity, longitudinal same-sample imaging, deep imaging capability, or nanoscale resolution.
Inadequate Interventions for Greenhouse Gas Removal
Geophysics and Climate
0 capabilities · 2 candidate tools
We need more effective approaches to removing greenhouse gases from the atmosphere to mitigate climate impacts. However, challenges remain in harnessing natural carbon removal systems—due to difficulties in accurately measuring their environmental impact—and in reducing methane emissions from sources like the cow rumen. Innovative strategies, including modifying cow microbiomes and deploying scalable measurement and validation platforms, are essential to advance greenhouse gas removal efforts.
See also: https://www.bezosearthfund.org/news-and-insights/bezos-earth-fund-releases-global-roadmap-to-scale-greenhouse-gas-removal-technologies and https://gaps.frontierclimate.com/
Candidate tools
CRISPR-Cas-mediated genome editing is a programmable genome-editing approach discussed here in the context of bacterial systems. The cited review summarizes the main approaches for bacterial CRISPR-Cas editing and the difficulties associated with applying these systems in bacteria.
The gap explicitly mentions modifying cow microbiomes to reduce methane emissions, and bacterial CRISPR-Cas editing is a directly actionable method for engineering microbial strains. It could support intervention development in rumen-associated bacteria, although the supplied evidence does not show rumen use or methane-related targets.
Assumptions: Assumes some relevant methane-associated organisms in the target workflow are bacteria and genetically tractable.
Missing evidence: No supplied evidence for rumen deployment, methanogen targeting, methane-reduction phenotypes, delivery in complex microbiomes, or environmental validation.
Mathematical modeling is a computational method used to guide the rational design of synthetic gene circuits. The cited literature also places it alongside live-cell imaging and within quantitative model systems used to study microbial drug resistance and spatial-temporal features of cancer in mammalian cells.
The gap calls for scalable measurement and validation platforms, and mathematical modeling is an actionable computational method for designing and prioritizing interventions before expensive experiments. It may help narrow candidate microbiome or carbon-removal strategies, but the supplied evidence is generic and not climate-specific.
Assumptions: Assumes computational prioritization is part of the intended intervention or validation workflow.
Missing evidence: No supplied evidence for greenhouse-gas flux modeling, carbon accounting, rumen microbiome modeling, or field-scale measurement/validation.
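As a concrete example of the model-guided circuit design this entry describes, a classic genetic toggle switch (two mutually repressing genes, in the style of the Gardner-Cantor-Collins model, not any system cited here) can be checked for bistability in simulation before anything is built; the parameters below are illustrative.

```python
# Minimal sketch of model-guided gene-circuit design (illustrative; styled on
# the classic Gardner-Cantor-Collins toggle-switch ODEs, not a cited system):
# two mutually repressing genes, integrated with forward Euler.

def toggle(u0, v0, alpha=10.0, n=2.0, dt=0.01, steps=5000):
    u, v = u0, v0
    for _ in range(steps):
        du = alpha / (1.0 + v ** n) - u   # gene 1: repressed by gene 2
        dv = alpha / (1.0 + u ** n) - v   # gene 2: repressed by gene 1
        u, v = u + dt * du, v + dt * dv
    return u, v

# Bistability check: different starting points commit to opposite stable
# states, the design property a modeler verifies before building the circuit.
state_a = toggle(5.0, 0.1)   # gene 1 initially high
state_b = toggle(0.1, 5.0)   # gene 2 initially high
```

The same screen-in-silico-first workflow is what would let modeling prioritize candidate microbiome interventions before expensive experiments.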
Our Immune System Can Uniquely Recognize Nearly Any Molecule but We Don’t Know the Recognition Code
Immunology
0 capabilities · 2 candidate tools
A better understanding of how the immune system interacts at the molecular level with threats and triggers is critical. This knowledge would enable the development of predictive tools and technologies to augment immune responses—improving interventions against infections, cancers, and autoimmune disorders.
Candidate tools
Photo-crosslinking in this context is an application of genetic code expansion in Bacillus subtilis that enables light-triggered covalent capture of molecular interactions. The reported system was part of a broader noncanonical amino acid incorporation platform used for photo-crosslinking, click-labelling, and translational titration.
This method can covalently capture molecular interactions, which is directly relevant to mapping immune recognition events at the binding interface level. That could help generate mechanistic data on receptor-ligand contacts needed to infer recognition rules.
Assumptions: Assumes the method could be adapted from the reported bacterial context to immune recognition studies.
Missing evidence: No supplied evidence for use with immune receptors, antigens, antibodies, TCRs, or high-throughput recognition mapping.
Protein design is a computational engineering method discussed in reviews on protein structure prediction and on optogenetic tool development. It is presented as enabling the creation of protein-based tools, including light-responsive optogenetic systems, that can manipulate and monitor cellular activities.
Computational protein design could plausibly support hypothesis generation about molecular recognition determinants and engineered binders. That is relevant to decoding immune recognition, but the supplied evidence is generic and not specific to immune systems or recognition-code discovery.
Assumptions: Assumes computational design workflows could be redirected toward immune receptor or antigen interaction problems.
Missing evidence: No supplied evidence for immune recognition prediction, antibody/TCR specificity modeling, or validated use in immunology.
Risks of Malicious Bioengineering
Biosecurity
0 capabilities · 2 candidate tools
Advances in synthetic biology have unlocked unprecedented innovations, but also raise concerns about the potential for harmful bioengineering. Preventing misuse requires robust screening and control measures around DNA synthesis. Implementation must be coordinated and universal to effectively minimize the risk of malicious actors.
Candidate tools
DNA synthesis is presented as an engineering method that supports the development of new dynamic metabolic engineering systems. In the cited review, advances in DNA synthesis are identified as a factor that will continue to drive innovation in responsive cell factory design.
The gap specifically highlights screening and control measures around DNA synthesis, and this item is directly about DNA synthesis as an actionable engineering workflow. It could plausibly support implementation of synthesis-stage controls, although the supplied evidence does not describe any actual screening, authentication, or biosecurity mechanism.
Assumptions: Assumes synthesis-stage intervention is in scope even though the item is framed as an enabling method rather than a security tool.
Missing evidence: No evidence for sequence screening, customer screening, access control, misuse detection, or coordinated deployment.
Standardisation is an engineering method in synthetic biology in which engineering principles are applied to genetic manipulation workflows. The cited literature states that standardisation, together with key technical advances, enabled major gains in the speed and accuracy of genetic manipulation.
The gap emphasizes coordinated and universal implementation, and standardisation is at least directionally relevant to making workflows more uniform across organizations. It may help operationalize common procedures around synthesis oversight, but the supplied evidence does not tie it to biosecurity controls specifically.
Assumptions: Assumes workflow standardization could extend to security and screening processes.
Missing evidence: No direct evidence for DNA order screening, governance, compliance, threat detection, or biosecurity-specific standards.
Lack of Structure Prediction for Highly Dynamic Proteins
Biophysics
0 capabilities · 2 candidate tools
Current structure prediction tools like AlphaFold excel for stable proteins but struggle with highly dynamic proteins whose structures fluctuate continuously, leaving a gap in our understanding of intrinsically disordered proteins and protein allostery.
Candidate tools
AlphaFold3 is a computational structure-prediction method used in the cited study to model the MagMboI–DNA complex. In that work, it was applied to infer interactions with the 5'-GATC-3' recognition sequence and to guide optimization of the photoactivatable endonuclease variant MagMboI-plus for top-down genome engineering.
This is the only candidate with explicit evidence as a structure-prediction method, so it is at least directly adjacent to the gap. It could serve as a baseline comparator or starting point for dynamic-protein modeling workflows, even though the supplied evidence does not show that it solves highly dynamic or intrinsically disordered cases.
Assumptions: Assumes a baseline structure-prediction tool is still useful for benchmarking this gap.
Missing evidence: No supplied evidence for performance on intrinsically disordered proteins, conformational ensembles, allostery, or highly dynamic proteins.
Protein structure prediction is a computational method for inferring protein three-dimensional structure. In the supplied evidence, it is identified only as a topic covered in a 2019 review on advances in protein structure prediction and protein design.
The item is directly about computational protein structure prediction, which is the problem area named in the gap. However, the supplied evidence is only a general review-level description and does not identify a specific method for dynamic proteins.
Assumptions: Assumes broad structure-prediction methods may be relevant as a category-level starting point.
Missing evidence: No specific algorithm, workflow, benchmark, or evidence for dynamic proteins, disorder, or allosteric ensemble prediction.
Lack of Infrastructure Technologies and Strategies Optimized for Low-Resource Settings
Global Health
0 capabilities · 1 candidate tool
Global health outcomes are compromised by insufficient health systems and infrastructure that limit our ability to prevent and control infectious diseases. Key deficiencies include the lack of cost-effective antimicrobial materials to block pathogen transmission, underdeveloped intervention models for effective public health strategies, and outdated sanitation solutions that fail to meet the needs of vulnerable populations.
Candidate tools
CRISPR-based biosensors are molecular diagnostic constructs that use CRISPR systems for sequence-specific nucleic acid recognition to detect disease-associated targets. A 2023 review presents them as a strategy for detecting emerging infectious diseases.
The gap explicitly includes infectious disease control, and this item is described as a diagnostic strategy for detecting emerging infectious diseases. Sequence-specific nucleic acid detection could support decentralized surveillance or case finding where conventional laboratory infrastructure is limited.
Assumptions: Assumes the gap includes low-resource diagnostics as part of infrastructure needs.
Missing evidence: No supplied evidence on field deployment in low-resource settings, sample prep simplicity, instrument requirements, per-test cost, antimicrobial use, sanitation use, or public-health implementation outcomes.
Current Chip Fabrication Methods are Extremely Expensive and Hard to Change
Nanoscale Fabrication
0 capabilities · 1 candidate tool
Modern chip fabs are enormous, multi-billion-dollar facilities with limited versatility in what they can produce. This bottleneck restricts the ability to create assemblies with diverse molecular components on a small scale.
Candidate tools
Biofunctional nanodot arrays (bNDAs) are nanoscale surface-patterned delivery harnesses designed to spatially control dimerization and clustering of cell-surface receptors. In live cells, they were used to capture extracellularly GFP-tagged Lrp6 and drive assembly of active Wnt signalosomes at the plasma membrane.
This is the only candidate explicitly based on nanoscale surface patterning and spatial organization, which is directly relevant to building small-scale molecular assemblies without relying on full conventional chip-fab workflows. Its use for patterned receptor clustering suggests a plausible route to more reconfigurable, biologically functional nanoscale layouts.
Assumptions: Assumes the gap includes biologically functional nanoscale patterning rather than semiconductor device fabrication specifically.
Missing evidence: No evidence here on fabrication cost reduction, reconfigurability, manufacturing simplicity, or compatibility with non-cell-surface molecular components.
We Can’t Yet Replicate Animal Olfaction Synthetically as a Sensing and Classification Modality
Chemistry
0 capabilities · 1 candidate tool
We currently lack a comprehensive model explaining how biological systems decode and classify chemical signals through olfaction. Understanding this process is critical for applications ranging from flavor science to disease diagnostics to understanding and harnessing animal communication.
Candidate tools
Mathematical modeling is a computational method used to guide the rational design of synthetic gene circuits. The cited literature also places it alongside live-cell imaging and within quantitative model systems used to study microbial drug resistance and spatial-temporal features of cancer in mammalian cells.
The gap explicitly includes lacking a comprehensive model for how olfactory systems decode and classify chemical signals, and this item is directly a computational modeling method. It could plausibly help formalize decoding hypotheses and support synthetic sensor/classifier design, even though the supplied evidence is not olfaction-specific.
Assumptions: Assumes computational modeling is in scope as an actionable method for this gap.
Missing evidence: No supplied evidence that the modeling was applied to olfaction, chemical mixture decoding, receptor ensembles, or synthetic smell systems.
A Limited Set of Rigid Organizational Structures for Organizing and Funding Research Constrains the Forms of R&D That Get Done
Metascience
0 capabilities
This is more of a meta-bottleneck. But scientists spend a lot of time not doing science, and the institutions in which they work often embed incentive structures that hinder certain kinds of outcomes, such as more coordinated research.
Candidate tools
No candidate tools have been materialized for this gap yet.
AI Could Be Misused
Computation
0 capabilities
The risk of AI being misused—whether through malicious intent or unintended consequences—necessitates robust safeguards and countermeasures.
Candidate tools
No candidate tools have been materialized for this gap yet.
AI Could Go Rogue
Computation
0 capabilities
The potential for AI systems to behave unpredictably or dangerously (“go rogue”) is a critical concern. Ensuring safe and controllable AI architectures is essential for reliable operation.
See also:
• https://www.lesswrong.com/posts/fAW6RXLKTLHC3WXkS/shallow-review-of-technical-ai-safety-2024
• https://deepmind.google/discover/blog/taking-a-responsible-path-to-agi/
Candidate tools
No candidate tools have been materialized for this gap yet.
AI is Still Narrow in its Reasoning and Planning
Computation
0 capabilities
Current AI systems exhibit narrow reasoning and planning capabilities compared to human cognition. Broadening AI training methods to include holistic, brain-inspired architectures and cognitive frameworks can advance general intelligence (flagging that there is an AI safety risk here).
Candidate tools
No candidate tools have been materialized for this gap yet.
Clinical Trials are Inefficient, Slow and Scarce
Physiology and Medicine
0 capabilities
Clinical trial designs are often inefficient, resulting in high costs, lengthy timelines, and suboptimal patient outcomes. Innovative trial designs and decision-support tools are required to streamline the clinical evaluation process and accelerate therapeutic development.
Candidate tools
No candidate tools have been materialized for this gap yet.
Designing Buildings is Hard
Mechanical Engineering
0 capabilities
Architectural design and construction planning are complex and labor-intensive. Advanced computational design and AI-driven optimization have the potential to revolutionize how buildings and construction plans are generated.
Candidate tools
No candidate tools have been materialized for this gap yet.
Doing and publishing research is expensive and subject to structural roadblocks
Metascience
0 capabilities
Traditional structures dominate in how research is conducted and how its outputs are disseminated. Expensive publishing practices restrict and slow the spread of knowledge. We should replace outdated publishing practices and complement research practices with new approaches that leverage frugal innovation, community-led platforms, and open access. We imagine a future where scientific discovery is more inclusive and dynamic.
Candidate tools
No candidate tools have been materialized for this gap yet.
Education Modalities Suffer From Scaling Limitations
Social Science
0 capabilities
Current education systems face structural inefficiencies such as excessive administrative workloads on educators, overcrowded classrooms, and inequitable resource distribution. Innovative technologies have the potential to significantly reduce these burdens by providing tools that assist teachers with scheduling, grading, and creating personalized, adaptive lesson plans. Digital platforms could dynamically tailor learning experiences to individual student progress, complementing classroom teaching. Additionally, technology-driven improvements in administrative efficiency could free valuable resources, enhancing educational equity and overall student experiences.
“US K-12 teachers are 30% more likely to face burnout than U.S. soldiers, whose lives are defined by relentless duty, perpetual war and low wages.” - Adrienne Williams
“Given recent improvements in the quality, affordability, and usability of technologies like AI, computer vision, and AR/VR, we can reimagine a more personalized, research-driven K-12 experience—one better able to meet diverse learning needs and set students up for success both in school and in life. From chatbots for individual coaching to immersive mixed reality solutions to adaptive technologies supporting culturally responsive pedagogy, we have an opportunity to leverage new technologies to tackle pressing needs across literacy education, STEM instruction, and preparing students for the workforce.” - Kumar Garg
Candidate tools
No candidate tools have been materialized for this gap yet.
Ephemeral Societal Data on Proprietary Platforms
Social Science
0 capabilities
Much critical data is stored on proprietary platforms and is at risk of disappearing, hindering long-term research and reproducibility.
Candidate tools
No candidate tools have been materialized for this gap yet.
Fragile Supply Chains and Lack of Backup for Critical Infrastructure
Biosecurity
0 capabilities
Many critical supply chains and infrastructure systems are fragile and lack robust backup mechanisms, leaving society vulnerable.
Candidate tools
No candidate tools have been materialized for this gap yet.
Frontier Telescopes Are Expensive and Take Decades to Build
Astrophysics
0 capabilities
The current model for building space telescopes is cost-prohibitive and slow, often requiring decades of development. New approaches that exploit reduced launch costs and modular assembly are needed to accelerate telescope construction and reduce costs, and there needs to be the organizational structure and hunger to adopt such methods.
Candidate tools
No candidate tools have been materialized for this gap yet.
Higher-Resolution Views of the Universe Are Roadblocked by Formation Flying Technology
Astrophysics
0 capabilities
Space telescopes offer vastly superior sensitivity to ground-based systems, but enhancing their resolution requires spacecraft with sub-micron precision.
The angular resolution of a telescope is limited by the size of its primary optic. Coherent aperture synthesis (interferometry) gets around this by coherently combining signals from separated telescopes, so that resolution is set by the baseline separation rather than by any single aperture. This has been very successful at radio wavelengths (see the Event Horizon Telescope), but in the optical regime it requires extremely difficult optomechanics and controls, and ground-based sensitivity is inherently limited by the coherence time of the atmosphere.
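The baseline scaling can be made concrete with back-of-envelope numbers; the parameters below are illustrative (EHT-like mm-wave values and a hypothetical optical case), not a specific mission design.

```python
import math

# Back-of-envelope resolution estimates: a single filled aperture resolves
# roughly theta ~ 1.22 * lambda / D, while an interferometer with baseline B
# resolves roughly theta ~ lambda / B. Values below are illustrative.

RAD_TO_MICROARCSEC = math.degrees(1.0) * 3600e6   # radians -> microarcseconds

def single_dish_uas(wavelength_m, diameter_m):
    return 1.22 * wavelength_m / diameter_m * RAD_TO_MICROARCSEC

def interferometer_uas(wavelength_m, baseline_m):
    return wavelength_m / baseline_m * RAD_TO_MICROARCSEC

# 1.3 mm radio waves on an Earth-scale baseline (~1e7 m): tens of microarcsec,
# the regime in which the Event Horizon Telescope operates.
eht_like = interferometer_uas(1.3e-3, 1.0e7)

# 550 nm visible light would need only a ~5 km baseline for comparable
# resolution, which is why optical interferometry is so attractive despite
# the much harder optomechanical tolerances.
optical = interferometer_uas(550e-9, 5.0e3)
```

The comparison makes the document's point quantitative: shorter wavelengths buy enormous resolution per meter of baseline, at the price of sub-wavelength (here sub-micron) path-length control.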
Candidate tools
No candidate tools have been materialized for this gap yet.
Inability to Anticipate or Prevent Ecosystem Tipping Points
Ecology
0 capabilities
We lack the models and infrastructure to monitor and predict how ecosystems behave under stress or when they might collapse, as well as the metrics needed to determine whether restoration is effective. We need to understand the underlying dynamics, feedback loops, and thresholds that lead to ecosystem degradation and collapse, and to secure essential systems like pollination.
Candidate tools
No candidate tools have been materialized for this gap yet.
Inability to Comprehend and Synthesize the Entire Scientific Literature at Scale
Metascience
0 capabilities
The volume of scientific publications is overwhelming, making it difficult for humans to read, comprehend, and synthesize the entire body of literature. How can AI-generated knowledge become cumulative? What should a machine-human shared Wikipedia look like? We should collect and synthesize all the world’s knowledge, accelerate its development, and make it universally available in a compelling form.
Candidate tools
No candidate tools have been materialized for this gap yet.
Inability to Model Turbulence
Physics
0 capabilities
Modeling turbulence remains one of the most challenging problems in physics due to its nonlinear and chaotic nature.
Candidate tools
No candidate tools have been materialized for this gap yet.
Inadequate Blockers of Transmission
Biosecurity
0 capabilities
Our ability to block the transmission of pathogens is limited. Without effective strategies, airborne and surface-based transmission continues to spread diseases. A meta roadmap is here.
Candidate tools
No candidate tools have been materialized for this gap yet.
Incomplete Resolution of the Possibility of Low-Energy Nuclear Reactions
Physics
0 capabilities
Low-energy nuclear reactions (LENRs) have received substantial attention, and there is no good evidence that they exist; even so, other mechanisms or parameter combinations may remain underexplored.
Candidate tools
No candidate tools have been materialized for this gap yet.
Insufficient Monitoring and Modeling of Climate Processes and Control Paths
Geophysics and Climate
0 capabilities
We have limited capacity to predict key disruptive events, such as solar flares that threaten power grids and communications, alongside an incomplete understanding of natural processes (atmospheric, ocean, etc.) that underpin climate models. We need better monitoring tools for characterizing phenomena that impact climate dynamics, such as aerosol-cloud interactions, and assessing potential interventions such as marine cloud brightening. These issues underscore the need for enhanced observational tools and more sophisticated models of climate processes.
Candidate tools
No candidate tools have been materialized for this gap yet.
Labor-Replacing AI Could Lead to Human Disempowerment
Social Science
0 capabilities
As AI systems become the cornerstone of competitive advantage, they can inadvertently marginalize human roles and decision-making. The drive for efficiency and cost reduction may lead organizations to rely predominantly on AI, sidelining human judgment, creativity, and accountability. This dynamic risks creating environments where economic and social inequities widen, and the intrinsic value of human input is systematically undermined (see examples). The gradual disempowerment of individuals under such competitive pressures poses significant challenges for societal well-being and democratic governance.
See: https://gradual-disempowerment.ai/
Candidate tools
No candidate tools have been materialized for this gap yet.
Lack of a Dedicated Field for Planetary Terraforming
Space Engineering
0 capabilities
There is currently no established field for systematically studying and applying planetary terraforming methods, leaving key challenges in transport, energy supply, and civil engineering largely unaddressed.
Candidate tools
No candidate tools have been materialized for this gap yet.
Major Planetary Science and Astrobiology Missions Are Not Realized by Existing Government Space Agencies
Astrophysics
0 capabilities
Some of the most important planetary science and astrobiology missions remain unrealized by traditional government agencies like NASA. Alternative, independent initiatives are needed to explore these high-priority scientific questions.
Candidate tools
No candidate tools have been materialized for this gap yet.
Many Methods Are Stuck in 20th Century Fabrication Paradigms
Mechanical Engineering
0 capabilities
Modern manufacturing systems largely rely on paradigms developed in the last century where large machines produce components smaller than themselves. This approach is increasingly limited by scaling challenges and cost inefficiencies. To meet future demands, we need to reimagine manufacturing by developing universal robotic construction systems and low-capital, high-energy manufacturing solutions that leverage emerging technologies such as advanced robotics, precision machining, and renewable energy integration. These innovations could, for example, dramatically lower the cost of machining high-performance materials like titanium or enable widespread automation in sectors like desalination.
Candidate tools
No candidate tools have been materialized for this gap yet.
Our Compute Stack is Insecure but Fundamentally Doesn’t Have To Be
Computation
0 capabilities
Insecure software can lead to vulnerabilities that undermine the reliability and safety of computational systems. Formal methods and rigorous verification are needed to synthesize secure software.
Candidate tools
No candidate tools have been materialized for this gap yet.
Our Immune Memory Contains a Detailed History of Exposures but We Can’t Read It
Immunology
0 capabilities
Immunological diseases often have nonobvious, complex etiologies and pathophysiologies that are difficult to identify.
Candidate tools
No candidate tools have been materialized for this gap yet.
Our Platforms for Civic Engagement and Democratic Decision-Making Don’t Take Advantage of 21st Century Scalable Technology
Social Science
0 capabilities
Our current systems for democratic participation are hindered by outdated platforms and tools that fail to scale with modern needs. Limited survey infrastructure, insecure voting methods, and under-informative deliberative tools restrict our capacity for informed, collective decision-making. By harnessing AI to facilitate clearer expression of public opinion and leveraging innovative technologies for secure, scalable engagement, we can transform civic participation into a more robust, effective, and inclusive process.
Candidate tools
No candidate tools have been materialized for this gap yet.
Outdated Space Station Construction
Space Engineering
0 capabilities
Only one new space station (Tiangong, 天宫) has been launched this century, due to high costs and reliance on traditional, government-led megaprojects.
Candidate tools
No candidate tools have been materialized for this gap yet.
Particle Accelerators Are Large and Expensive
Physics
0 capabilities
Traditional particle accelerators are enormous and costly, limiting experimental flexibility. Compact, benchtop accelerators could democratize high-energy physics and open new avenues in applications such as medical isotope production.
Candidate tools
No candidate tools have been materialized for this gap yet.
Policy Creation and Evaluation is Manual and Suffers from Low Efficiency and Accountability
Social Science
0 capabilities
Policy development and evaluation processes today rely heavily on manual human review to ensure accountability. However, as AI systems increasingly support or automate these processes, this human-centered accountability becomes challenging. Human reviewers risk becoming a critical bottleneck, slowing policy implementation. New tools are needed to streamline policy creation and evaluation, and to ensure consistency and compliance before deployment.
Candidate tools
No candidate tools have been materialized for this gap yet.
Proving Math Theorems is Challenging for Both Humans and AI
Computation
0 capabilities
Both human mathematicians and current AI systems struggle with proving complex math theorems. Enhancing theorem proving through interactive and automated methods could push the boundaries of mathematical reasoning.
Candidate tools
No candidate tools have been materialized for this gap yet.
Quantum Gravity is Experimentally Hard to Constrain
Physics
0 capabilities
Quantum gravity remains elusive, with experimental constraints hindered by the need for extremely large-scale or prohibitively expensive experiments.
Candidate tools
No candidate tools have been materialized for this gap yet.
Robot Hardware and Software is Still Clunky
Mechanical Engineering
0 capabilities
Robots have the potential to revolutionize manufacturing, logistics, and many other industries—but only if they are both affordable and capable of high performance. Today’s robotic hardware is often prohibitively expensive and built using legacy designs that do not prioritize cost reduction, modularity, or scalability. Moreover, many robots struggle with dexterity and tactile sensing, and current design practices decouple hardware and software, preventing a co-evolution that could unlock new performance regimes. Overcoming these limitations requires a rethinking of both robot morphology and control, with an emphasis on integrated design, cost-effective production, and enhanced functionality.
Candidate tools
No candidate tools have been materialized for this gap yet.
Robust and Compact Plasma Confinement for Fusion is Still Not Solved
Physics
0 capabilities
Stable plasma confinement is a major obstacle in achieving practical fusion energy. Advanced control systems and novel confinement techniques are needed.
Candidate tools
No candidate tools have been materialized for this gap yet.
Sim-to-Real Transfer for Robots is Hard
Mechanical Engineering
0 capabilities
Bridging the gap between simulated robot behavior and real-world performance remains a significant challenge, particularly for tactile interactions and complex environments.
Candidate tools
No candidate tools have been materialized for this gap yet.
Uncertainty and Noise in the Science of Room-Temperature Superconductivity
Physics
0 capabilities
There remains significant uncertainty over whether metallic hydrogen can exhibit room-temperature superconductivity at reasonable pressures, and measurements of other systems have been irreproducible and fragmented.
Candidate tools
No candidate tools have been materialized for this gap yet.
Underdevelopment of Deep Tooling for Economic Modeling and Future Forecasting
Social Science
0 capabilities
Current economic models are often too simplistic to capture the intricate dynamics of our global economy, limiting effective policy-making and forecasting. Experimentation with innovative economic models—such as those incorporating universal basic income or alternative market systems—is rare, leaving us unprepared for emerging trends. The inherent complexity of global systems further complicates accurate forecasting, underscoring the urgent need for more sophisticated, adaptive tools that can better predict and navigate the economic landscape of tomorrow.
Candidate tools
No candidate tools have been materialized for this gap yet.
Underdevelopment of Modern Tools in the Social Sciences
Social Science
0 capabilities
The social sciences need new tools to help researchers identify and prioritize important questions that will have an impact, and better infrastructure to collect qualitative data. Qualitative methods are powerful for understanding the how and why behind social outcomes, yet even the most comprehensive surveys don’t capture all the factors that contribute to social outcomes. AI-enabled qualitative methods could super-charge the social sciences, but there is much work to be done.
Similarly, many archaeological methods remain manual and lack the technological revolution seen in other fields, limiting discovery and analysis.
Candidate tools
No candidate tools have been materialized for this gap yet.
We Have a Limited Ability to Acquire, Concentrate and Substitute Chemical Elements in Processes
Materials Science
0 capabilities
The cost of materials is often dominated by the cost to obtain their constituent elements. What presents commercially as the “critical minerals problem” masks a larger scientific bottleneck on how we acquire, concentrate, and substitute chemical elements.
Candidate tools
No candidate tools have been materialized for this gap yet.