DARPA releases ECOLE BAA
On September 15, the Defense Advanced Research Projects Agency (DARPA) released the Environment-driven Conceptual Learning (ECOLE) broad agency announcement (BAA). Proposals are due by 12:00 p.m. Eastern on November 14.
DARPA is soliciting innovative proposals in the following areas of interest: human language technology, computer vision, artificial intelligence (AI), reasoning, and human-computer interaction. Proposed research should investigate innovative approaches that enable revolutionary advances in science, devices, or systems. Specifically excluded is research that primarily results in evolutionary improvements to the existing state of practice.
The United States Department of Defense (DoD) and Intelligence Community (IC) need computational systems that can robustly and automatically analyze large amounts of multimodal data. Furthermore, these computational systems need to be able to communicate and cooperate with human beings to resolve ambiguities and improve performance over time. The ECOLE program will create AI agents capable of continually learning from linguistic and visual input to enable human-machine collaborative analysis of image, video, and multimedia documents during time-sensitive, mission-critical DoD analytic tasks, where reliability and robustness are essential.
ECOLE will transform current machine learning approaches by developing algorithms that can identify, represent, and ground the attributes that form the symbolic and contextual model of a particular object or activity through interactive learning with a human analyst. Knowledge of attributes and affordances, learned dynamically from data encountered within an analytic workflow, will enable joint reasoning with a human partner. This acquired knowledge will also enable the machine to recognize when an observed object or activity is novel, rather than misclassifying the newly observed object or activity as a member of a previously learned class, and to readily learn a new symbolic representation through interaction with its human partner. Attribute-informed novelty detection will also enable the machine to detect changes in known objects and report these changes when they are significant.
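To illustrate the idea (this sketch is not part of the BAA), attribute-informed novelty detection can be thought of as comparing an observation's attribute vector against prototypes of known classes and flagging the observation as novel when no prototype is close enough; all names, vectors, and the distance threshold below are hypothetical.

```python
import numpy as np

def detect_novelty(attributes, prototypes, threshold=0.5):
    """Return the nearest known class, or None when the attribute vector
    is far from every prototype (hypothetical illustration of
    attribute-informed novelty detection)."""
    best_class, best_dist = None, float("inf")
    for label, proto in prototypes.items():
        dist = np.linalg.norm(attributes - proto)
        if dist < best_dist:
            best_class, best_dist = label, dist
    if best_dist > threshold:
        return None  # novel: defer to the human partner for a new symbol
    return best_class

# Known classes described by made-up attribute prototypes.
prototypes = {
    "truck": np.array([1.0, 0.0, 1.0]),
    "boat":  np.array([0.0, 1.0, 0.0]),
}
print(detect_novelty(np.array([1.0, 0.1, 0.9]), prototypes))  # near "truck"
print(detect_novelty(np.array([0.5, 0.5, 0.5]), prototypes))  # None -> novel
```

In this toy framing, a `None` result is the trigger for the interactive step the BAA describes: rather than forcing the observation into an existing class, the system would ask its human partner to supply a new symbolic label.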
ECOLE representations (“concepts”) must capture both the attributes (essential and incidental characteristics) and affordances (capabilities) of objects, as well as the component primitive actions and associated activity participants. For objects, these learned features will enable analysis of both physical properties and the potential interactions between this object and the environment, including surrounding objects. For activities, the learned features will enable both analysis of component sequences of actions and prediction of associated actions and participants in a partially observed scenario. In each case, achieving the requisite representations will involve using symbolic knowledge.
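The structure described above can be sketched as a pair of simple record types (again, a hypothetical illustration rather than anything specified in the BAA): an object concept carries attributes and affordances, while an activity concept additionally carries component primitive actions and participant roles.

```python
from dataclasses import dataclass, field

@dataclass
class Concept:
    """Hypothetical sketch of an ECOLE-style concept representation."""
    name: str
    attributes: dict   # essential and incidental characteristics
    affordances: list  # capabilities, i.e. potential interactions

@dataclass
class ActivityConcept(Concept):
    actions: list = field(default_factory=list)       # component primitive actions
    participants: list = field(default_factory=list)  # associated participant roles

# Example: a partially observed activity whose participants could be
# predicted from its learned component actions.
unloading = ActivityConcept(
    name="unloading",
    attributes={"location": "dock"},
    affordances=["transfers cargo"],
    actions=["lift", "lower"],
    participants=["crane", "truck"],
)
print(unloading.actions)
```

Separating attributes from affordances mirrors the distinction in the announcement between analyzing an object's physical properties and reasoning about its potential interactions with the surrounding environment.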
To achieve an extensible framework capable of instantiating an arbitrarily large (i.e., global-scale) web of knowledge representations, ECOLE systems will be required to learn symbolic object and activity representations, including the component properties and affordances, directly from unlabeled multimodal data. ECOLE systems will also need to be able to refine these representations through collaboration with a human partner. System interaction with human analysts is expected to be symbiotic, with the systems augmenting human cognitive capabilities while simultaneously seeking instruction and correction to improve accuracy. With respect to capability development, ECOLE will structure the knowledge acquisition process as a form of distributed curriculum learning, with systems building increasingly complex and nuanced understandings of objects, activities, and their interrelationships via a combination of self-directed and human-informed, semi-structured learning from a diverse set of human experts.