Challenge Problems

đź’¬ Participatory Democracy

Bot Detection on Social Media
We have historical datasets from previous studies on Russian Telegram bots operating in Ukraine and Moldova (e.g., the 2025 parliamentary election). These datasets include all comments and accounts already flagged as bots. Currently, bot detection is semi-automated and still requires manual verification of each account—a process that doesn’t scale. The mission is to develop or enhance an algorithm for automatic bot detection that meets the following goals: 1) Reduce manual intervention – Minimize or eliminate the need for human verification; 2) Early detection – Identify bots after fewer comments are posted; 3) Adaptability – Handle evolving bot strategies, including AI-generated, unique (non-duplicated) comments.
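
As a starting point, a supervised classifier trained on the already-flagged accounts can score new accounts automatically and reserve manual review for uncertain cases. The sketch below is a minimal baseline; the feature names and the exported CSV are assumptions standing in for whatever the historical datasets actually contain.

```python
# Minimal sketch: score accounts for bot-likeness from simple per-account
# features. Column names (comments_per_day, duplicate_ratio, account_age_days,
# is_bot) are placeholders for whatever the historical datasets provide.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

df = pd.read_csv("flagged_accounts.csv")   # hypothetical export of labeled accounts
features = ["comments_per_day", "duplicate_ratio", "account_age_days"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["is_bot"], test_size=0.2, random_state=42, stratify=df["is_bot"]
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Report precision/recall; the goal is to keep manual review only for
# accounts whose predicted probability falls in an uncertain middle band.
print(classification_report(y_test, model.predict(X_test)))
probs = model.predict_proba(X_test)[:, 1]
needs_review = ((probs > 0.3) & (probs < 0.7)).sum()
print(f"{needs_review} accounts would still need manual review")
```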

Mentor(s): Yuliia Dukach (OpenMinds)

Building an Early Disinfo Signal Detection Pipeline from Fake Account Markets
Fake social media accounts—often powered by bot armies—are a cornerstone of online disinformation. The Cambridge Online Trust and Safety Index (COTSI) currently tracks the daily price and availability of fake SMS verifications across countries for 500+ platforms. This data provides insight into one critical step in fake account creation, but SMS verification is only part of the ecosystem: many other factors influence the cost and accessibility of fake accounts. The next evolution of COTSI is to track fake account prices across platforms on a daily basis, enabling deeper analysis of what drives these prices—and how to raise them to deter disinformation and scams. The mission baseline is to develop a data collection pipeline that: 1) Scrapes price and availability data from at least 4 online fake account markets (more is better); 2) Runs as robust, automated scripts (e.g., cron jobs) to collect time-series data; 3) Stores data in Firebase collections, ready for integration with existing COTSI APIs.
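
A minimal sketch of one scraper job in such a pipeline, assuming a hypothetical market page and placeholder CSS selectors; each real market would need its own parsing logic, error handling, and scheduling (e.g., via cron).

```python
# Minimal sketch of one scraper job, meant to run on a schedule (e.g., cron).
# The market URL and CSS selectors are placeholders; each real market will
# need its own parsing logic. Assumes a Firebase service-account key file.
import requests
from bs4 import BeautifulSoup
import firebase_admin
from firebase_admin import credentials, firestore

MARKET_URL = "https://example-market.test/accounts"   # placeholder market

def scrape_market():
    html = requests.get(MARKET_URL, timeout=30).text
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for listing in soup.select(".listing"):            # placeholder selector
        rows.append({
            "platform": listing.select_one(".platform").get_text(strip=True),
            "price_usd": float(listing.select_one(".price").get_text(strip=True)),
            "in_stock": int(listing.select_one(".stock").get_text(strip=True)),
        })
    return rows

def store(rows):
    cred = credentials.Certificate("serviceAccount.json")
    firebase_admin.initialize_app(cred)
    db = firestore.client()
    for row in rows:
        row["scraped_at"] = firestore.SERVER_TIMESTAMP
        db.collection("fake_account_prices").add(row)   # time-series collection

if __name__ == "__main__":
    store(scrape_market())
```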

Mentor(s): Jon Roozenbeek (Cambridge University)

Baseline Exploration for Bias Detection using Collaborative Knowledge Bases
Platforms like Wikipedia and Wikidata are open, collaboratively curated repositories that aim to represent “the sum of all knowledge.” These sources evolve continuously and reflect collective editorial decisions. Could they also serve as reliable foundations for detecting narrative leanings and positioning biases in public discourse? Narratives around contentious issues—such as geopolitical conflicts or policy debates—are shaped by framing choices, references, and editorial support. Publicly curated data (e.g., who supports certain formulations, what sources are cited) may contain signals of bias. The challenge is to determine whether these signals can be systematically extracted and analyzed to characterize stance and narrative framing. The mission is to design and implement methods to: 1) Evaluate the stability and reliability of these sources for bias-sensitive applications (can they support stance detection without frequent retraining?); 2) Extract and compare narrative framings from Wikipedia and Wikidata entries on selected contentious topics; 3) Apply AI/ML techniques to analyze these framings and assess their relationship to public discourse (e.g., news or social media narratives).
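
One concrete, low-cost signal mentioned above is which sources are cited. The sketch below uses the public MediaWiki API to compare the domains of external links cited for the same topic in two language editions; the topic and article titles are illustrative only, and a real analysis would follow the API’s continuation mechanism for long pages.

```python
# Minimal sketch: compare which source domains are cited for the same topic
# across two Wikipedia language editions, as one rough signal of framing.
# Topic title and language codes are examples only; single API batch shown.
from collections import Counter
from urllib.parse import urlparse
import requests

def cited_domains(lang, title):
    """Return a Counter of external-link domains cited on a Wikipedia page."""
    resp = requests.get(
        f"https://{lang}.wikipedia.org/w/api.php",
        params={"action": "query", "titles": title, "prop": "extlinks",
                "ellimit": "max", "format": "json"},
        timeout=30,
    ).json()
    domains = Counter()
    for page in resp["query"]["pages"].values():
        for link in page.get("extlinks", []):
            domains[urlparse(link["*"]).netloc] += 1
    return domains

# Illustrative article titles; confirm exact names for the chosen topic.
en = cited_domains("en", "Climate change")
uk = cited_domains("uk", "Зміна клімату")
print("Top domains (en):", en.most_common(10))
print("Domains cited in only one edition:", set(en) ^ set(uk))
```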

Mentor(s): Mykola Trokhymovych (Google Fellow, Universitat Pompeu Fabra)

Narrative Tracking Across Contexts
False or misleading narratives about the Russia–Ukraine conflict often transcend linguistic and national boundaries. Claims originating in Russian-language media or Telegram channels frequently reappear—sometimes reframed or translated—across Western news and social media platforms within days. Understanding this migration is key to combating disinformation. How do narratives move between Russian and English-language ecosystems? What patterns emerge in their timing, amplification, and transformation? Mapping these flows can reveal vulnerabilities in the information space and inform resilience strategies. The mission is to design and implement a data-driven narrative-tracing pipeline that: 1) Produces an interactive visualization (network map or timeline) showing cross-lingual information flows; 2) Collects and curates multilingual datasets from open sources (e.g., EUvsDisinfo, GDELT, ConflictMisinfo, MediaCloud); 3) Applies NLP and temporal analysis to identify overlaps and timing correlations between narratives.
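
A minimal sketch of the cross-lingual matching step, assuming a multilingual sentence-embedding model and a tiny inline dataset for illustration; a real pipeline would ingest items from the sources listed above and tune the similarity threshold.

```python
# Minimal sketch: match Russian-language claims to later English-language
# items by multilingual embedding similarity, then report the time lag.
# The model name and the tiny inline dataset are assumptions for illustration.
from datetime import datetime
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

ru_claims = [
    {"text": "пример нарратива о конфликте", "date": datetime(2024, 3, 1)},
]
en_items = [
    {"text": "example narrative about the conflict", "date": datetime(2024, 3, 4)},
]

ru_emb = model.encode([c["text"] for c in ru_claims], convert_to_tensor=True)
en_emb = model.encode([c["text"] for c in en_items], convert_to_tensor=True)
scores = util.cos_sim(ru_emb, en_emb)   # pairwise cross-lingual similarity

for i, claim in enumerate(ru_claims):
    for j, item in enumerate(en_items):
        if scores[i][j] > 0.7:          # similarity threshold to tune
            lag_days = (item["date"] - claim["date"]).days
            print(f"Possible reuse after {lag_days} days: {item['text']!r}")
```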

Mentor(s): IN2

🛡️ Civil Defense

Drone Detection – Acoustic
Fiber-optic-controlled drones, which are piloted over a thin physical cable rather than a radio link, produce no radio emissions, making traditional RF-based detection ineffective and posing significant security risks. Acoustic detection offers a promising alternative by leveraging the unique sound signatures of drone propellers. However, this approach faces challenges such as environmental noise interference (wind, rain, traffic) and the need for accurate direction finding in real time. The mission is to develop and demonstrate a working prototype of an autonomous device that uses an array of microphones (an acoustic array) to detect unmanned aerial vehicles (UAVs), filter out environmental noise, and determine the direction (bearing) of the sound source.
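
A minimal sketch of the direction-finding piece with just two microphones: band-pass filter to suppress wind and traffic rumble, cross-correlate to estimate the time difference of arrival, and convert it to a bearing. The sample rate, microphone spacing, and synthetic test signal are assumptions; a working prototype would use a larger array, GCC-PHAT weighting, and a drone-vs-noise classifier on top.

```python
# Minimal sketch: estimate the bearing of a sound source from two microphones
# via time difference of arrival (cross-correlation). Parameters are assumed.
import numpy as np
from scipy.signal import butter, filtfilt, correlate

FS = 48_000           # sample rate (Hz)
MIC_SPACING = 0.20    # metres between the two microphones
SPEED_OF_SOUND = 343.0

def bandpass(x, low=200.0, high=8000.0):
    """Suppress low-frequency wind/traffic rumble before correlating."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    return filtfilt(b, a, x)

def bearing_deg(mic_a, mic_b):
    a, b = bandpass(mic_a), bandpass(mic_b)
    corr = correlate(a, b, mode="full")
    delay = (np.argmax(corr) - (len(b) - 1)) / FS      # seconds, signed
    # Clamp to the physically possible range before taking arcsin.
    ratio = np.clip(SPEED_OF_SOUND * delay / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

# Synthetic test: broadband noise arriving 5 samples later at the second mic.
rng = np.random.default_rng(0)
src = rng.standard_normal(FS)
print(f"Estimated bearing: {bearing_deg(src, np.roll(src, 5)):.1f} degrees")
```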

Mentor(s): Sebastian Monzon, Skyfall, MathWorks

Vital Signal Detection for Search & Rescue
In extreme conditions—such as search and rescue missions, long-distance hikes, or conflict zones—delays in medical assistance due to lack of real-time health and location data can be fatal. Conventional mobile networks often fail in these environments, leaving teams without reliable communication. The mission is to develop a fully functional, energy-efficient remote personal medical monitoring system that uses the LoRa protocol to transmit critical data—vital signs and geolocation—in areas with limited or no mobile network coverage. The solution should prioritize reliability, low power consumption, and robust performance under harsh conditions.
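
Because LoRa airtime and battery budgets are tight, vitals and position are best packed into a few fixed-size bytes rather than sent as JSON. The sketch below shows one possible 14-byte payload format; the specific fields and scaling factors are assumptions, not a defined standard.

```python
# Minimal sketch of a compact payload format for LoRa transmission.
# Field choices (heart rate, SpO2, skin temperature, lat/lon) are illustrative.
import struct

# Format: uint16 packet id, uint8 heart rate (bpm), uint8 SpO2 (%),
# int16 temperature (0.1 degC units), int32 lat/lon (degrees * 1e7) = 14 bytes.
PAYLOAD_FMT = ">HBBhii"

def encode(packet_id, hr, spo2, temp_c, lat, lon):
    return struct.pack(PAYLOAD_FMT, packet_id, hr, spo2,
                       int(round(temp_c * 10)),
                       int(round(lat * 1e7)), int(round(lon * 1e7)))

def decode(payload):
    pid, hr, spo2, temp, lat, lon = struct.unpack(PAYLOAD_FMT, payload)
    return {"id": pid, "hr_bpm": hr, "spo2_pct": spo2,
            "temp_c": temp / 10, "lat": lat / 1e7, "lon": lon / 1e7}

pkt = encode(1, hr=72, spo2=97, temp_c=36.6, lat=50.4501, lon=30.5234)
print(len(pkt), "bytes:", decode(pkt))
```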

Mentor(s): Sebastian Monzon, Skyfall

Interfaces to Help Detect Unexploded Remnants of War
Current detection systems, which often combine metal detectors with ground-penetrating radar (GPR), wire detectors, and carbon-rod sensors, typically output only simple audio beeps, making them difficult to use effectively without extensive training. This challenge is to develop a more intuitive user interface that translates these multi-sensor signals into visual or tactile feedback. Teams can build a prototype using a variety of devices that we have access to: professional-grade systems like Ceia, Mine Hound, and AN/PSS-14, all of which output audio as their primary signal, as well as lower-cost, hobbyist-grade detectors. We expect that a refined, modern interface will benefit both highly experienced bomb-clearance professionals and newcomers alike, enabling faster learning, better accuracy, and safer operations.
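
A minimal sketch of the first processing step such an interface needs: converting the detector’s audio output into a numeric signal-strength level that a visual bar or haptic buzzer could display. The WAV capture filename is hypothetical, and the console bars stand in for a real UI.

```python
# Minimal sketch: turn a detector's audio output into a signal-strength level
# that a visual or haptic UI could display, instead of a raw beep.
# Assumes the detector audio is available as a WAV capture.
import numpy as np
from scipy.io import wavfile

def strength_levels(wav_path, window_s=0.05, bars=10):
    """Return a 0..`bars` intensity level for each short window of audio."""
    fs, audio = wavfile.read(wav_path)
    audio = audio.astype(np.float64)
    if audio.ndim > 1:                      # mix down if stereo
        audio = audio.mean(axis=1)
    win = max(1, int(fs * window_s))
    n_windows = len(audio) // win
    rms = np.sqrt(np.mean(
        audio[: n_windows * win].reshape(n_windows, win) ** 2, axis=1))
    peak = rms.max() or 1.0                 # avoid divide-by-zero on silence
    return np.round(bars * rms / peak).astype(int)

# Console stand-in for a real display: one bar row per 50 ms window.
for level in strength_levels("detector_audio.wav"):   # hypothetical capture file
    print("#" * level)
```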

Mentor(s): Paul Martin (Massachusetts Army National Guard)

Sparse Image Stitching for Finding Unexploded Remnants of War
In environments with few visual features—such as open fields, deserts, or snow-covered terrain—traditional drone-image stitching tools often fail to produce accurate orthomosaics. These failures can hinder operator interpretation and, in some cases, obscure critical details needed for assessing risks and hazards. This project challenges teams to build a simple, open-source system capable of ingesting drone flight data and reliably stitching images in feature-sparse conditions, with or without GPS or RTK inputs. Example datasets that current tools like Pix4D struggle with will be provided, along with connections to operational partners such as HALO Trust and others.
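
As a baseline to measure improvements against, OpenCV’s built-in stitcher in SCANS mode can be run over a flight folder; it is exactly the kind of feature-matching approach that tends to break down on open fields and snow. Paths below are placeholders.

```python
# Minimal sketch: baseline orthomosaic attempt with OpenCV's stitcher in
# SCANS mode (suited to nadir drone imagery). The challenge is to handle the
# feature-sparse cases where this fails, e.g. by folding in GPS/RTK priors.
import sys
import glob
import cv2

images = [cv2.imread(p) for p in sorted(glob.glob("flight_images/*.JPG"))]
images = [im for im in images if im is not None]

stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, mosaic = stitcher.stitch(images)

if status == cv2.Stitcher_OK:
    cv2.imwrite("mosaic.png", mosaic)
    print("Wrote mosaic.png")
else:
    # Typical failure on open fields/snow: not enough matched features.
    print(f"Stitching failed with status {status}", file=sys.stderr)
```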

Mentor: Dmitro Martynowych (MineSight)

✊🏾 Societal Resilience

Engineering Education Tools Development

Many Ukrainian children have not attended school in person since pre-pandemic times, and most do not have enough opportunities for hands-on learning. In Kharkiv, children have access to the Natural History Museum run by Kharkiv National University. There is a need to develop educational tools that help children get inspired by biomimetic engineering examples and learn the design and engineering skills crucial for Ukraine's reconstruction.

Mentor: Valerie Tolstorukov, Rostislav Lunyachek, Nature Museum of Kharkiv National University

Energy Resilience

Next-generation modular nuclear power plants are essential for ensuring energy security in Ukraine and worldwide. Their successful development, scale-up, and safe operation require the creation of advanced materials as well as integrated hardware and software tools for in-situ monitoring of plant performance. The shortage of qualified early-career personnel and students presents a growing challenge to achieving these goals. However, Ukraine’s network of legacy high-energy-physics installations and research facilities offers a unique opportunity to test new materials and algorithms, while simultaneously developing training tools for the next generation of researchers—both human and AI-based agents.

Sub-Challenge 1: Resilient radiation-hardened materials for extreme environments
The goal is to train an AI co-pilot to accelerate simulations of the interactions of multi-particle ionizing radiation with structural and photonic materials of complex chemical compositions and geometrical form factors. This approach will advance the design of radiation-hardened, self-powered sensors and protective shielding for both humans and electronic systems.
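
One possible building block for such a co-pilot is a surrogate model: a fast regressor trained on a library of precomputed simulation runs, so that new material and geometry combinations can be screened without rerunning a full transport simulation each time. The sketch below assumes a hypothetical CSV of prior runs and illustrative column names.

```python
# Minimal sketch of a surrogate model trained on precomputed simulation runs.
# The input file and column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

runs = pd.read_csv("precomputed_runs.csv")    # hypothetical transport-simulation results
features = ["density_g_cm3", "thickness_mm", "particle_energy_mev"]
target = "transmitted_dose_fraction"

X_train, X_test, y_train, y_test = train_test_split(
    runs[features], runs[target], test_size=0.2, random_state=0)

surrogate = GradientBoostingRegressor().fit(X_train, y_train)
print("Held-out R^2:", r2_score(y_test, surrogate.predict(X_test)))

# Fast screening call: milliseconds per candidate instead of a full simulation.
candidate = pd.DataFrame([{"density_g_cm3": 2.7, "thickness_mm": 50,
                           "particle_energy_mev": 1.0}])
print("Predicted transmitted dose fraction:", surrogate.predict(candidate)[0])
```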

Sub-Challenge 2: Acoustic data analysis for remote monitoring 
The goal is to develop computational tools for remote monitoring of cooling system performance in nuclear reactors and power plants that utilize supercritical water as a coolant. Acoustic signals collected during system operation under ionizing radiation contain distinct spectral signatures associated with water undergoing phase transitions to and from the supercritical state. A signal-processing pipeline must be developed to classify these signatures and distinguish them from background noise, enabling the creation of automated systems for continuous remote monitoring. Additionally, the same analytical framework can be adapted for acoustic detection of drones, addressing a key civil defense challenge.
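
A minimal sketch of the classification step, assuming labeled audio clips sorted into folders and a few hand-chosen frequency bands; the actual spectral signatures of supercritical-water phase transitions would drive the real feature design.

```python
# Minimal sketch: summarise each audio clip by its average power in a few
# frequency bands, then train a classifier to separate phase-transition
# signatures from background noise. File layout, labels, and bands are assumed.
import glob
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier

BANDS = [(0, 1_000), (1_000, 5_000), (5_000, 20_000)]   # Hz, to be tuned

def band_powers(path):
    fs, audio = wavfile.read(path)
    audio = audio.astype(np.float64)
    if audio.ndim > 1:                                   # mix down if stereo
        audio = audio.mean(axis=1)
    f, t, sxx = spectrogram(audio, fs)
    return [sxx[(f >= lo) & (f < hi)].mean() for lo, hi in BANDS]

X, y = [], []
for label, pattern in [(1, "clips/transition/*.wav"), (0, "clips/background/*.wav")]:
    for path in glob.glob(pattern):                      # hypothetical clip folders
        X.append(band_powers(path))
        y.append(label)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print("Training accuracy:", clf.score(X, y))             # sanity check only
```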

Mentor: Svetlana Boriskina (MIT), Sebastian Monzon, MathWorks

Remote Medical Training Assessment

Medical education requires hands-on practice and expert observation to ensure students develop proper clinical skills. In Ukraine, the war has severely disrupted bedside teaching and practical training, with frequent air raids, power blackouts, and facility losses forcing many programs online. It is difficult to bring international expert educators to Ukrainian training sites in sufficient numbers, creating a critical bottleneck in maintaining education quality.

Remote observation and assessment could help address this gap. This challenge asks participants to develop protocols and tools that enable expert practitioners—including international specialists—to effectively observe, evaluate, and provide feedback on medical training remotely, helping maintain educational standards despite the constraints of wartime conditions.

Mentor: Nelya Melnitchouk (Brigham and Women’s Hospital, Global Medical Knowledge Alliance)

Automatic Medical Text Translation

Many medical textbooks and courses are in the process of being translated and adapted for use on and near the front line. Translation of some texts can be done automatically at a coarse level, but some items, such as specific medical terminology or drug names, require manual intervention to be presented correctly. Feedback from the front line can also affect how material is translated or presented for particular groups. The general steps required for translation include initial translation, scientific editing, and literary editing. A similar process is required for video dubbing.

This challenge seeks to develop and refine an automatic translation and review pipeline from English to Ukrainian, focusing on trauma care courses. This will involve identifying which elements can be automated, which require review, and how best to allocate the review work. Initial work will be with courses provided by the American College of Surgeons, and with medical professionals from the American Ukrainian Medical Foundation and Global Medical Knowledge Alliance.
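
A minimal sketch of the review-allocation step described above: produce a draft translation (stubbed here rather than calling any particular MT model or service), then route segments containing glossary terms such as drug names to a scientific editor. The glossary and routing rules are placeholders.

```python
# Minimal sketch: draft-translate each segment, then flag segments containing
# glossary terms (drug names, trauma terminology) for human scientific editing.
# The glossary and the translate() stub are placeholders, not a real model/API.
GLOSSARY = {"tranexamic acid", "tension pneumothorax", "tourniquet"}  # illustrative

def translate(segment: str) -> str:
    # Placeholder for a machine-translation model or service producing a Ukrainian draft.
    return f"[UK draft of: {segment}]"

def process(segments):
    for seg in segments:
        draft = translate(seg)
        needs_review = any(term in seg.lower() for term in GLOSSARY)
        yield {"source": seg, "draft": draft,
               "route": "scientific_editor" if needs_review else "literary_editor"}

course = [
    "Apply a tourniquet above the wound.",
    "Review the chapter objectives before class.",
]
for item in process(course):
    print(item["route"], "<-", item["source"])
```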

Mentor: Sergiy Nesterenko (UAMF), Nelya Melnitchouk (BWH, GMKA), Olga Maihutiak (GMKA), Sofia Lipkevych (MIT)