
Research & Analysis Bootcamp for Emergency Managers at the 2026 CESA Conference
Posted May 6, 2026
On Monday, May 4th, the California Resiliency Alliance presented a pre-conference workshop at the California Emergency Services Association’s (CESA) annual conference in San Diego. The session drew practitioners from across the state for an exploration of one of the more underemphasized skill sets in the profession: research and analysis.
The workshop opened with a reframe that set the tone for everything that followed. Emergency managers are, at their core, knowledge workers. The actual work of the profession — reading forecasts, synthesizing incident reports, advising elected officials under pressure, interpreting conflicting data, writing plans that require evidence — is fundamentally cognitive. Yet most professional training is built on a first-responder foundation and has never fully departed from it. ICS, NIMS, and incident management, while important, are process frameworks, not analytical ones. Exercises test whether people know their roles in a pre-defined structure, not whether they can correctly interpret an ambiguous information environment. The cognitive skills that distinguish a merely adequate emergency manager from an exceptional one — asking the right question, challenging assumptions, synthesizing conflicting information — are largely absent from formal curricula. This workshop dove into the research and analysis component of being an EM knowledge worker.
The workshop covered three primary areas:
- Changing Information Landscape
- Fundamentals of Research and Information Analysis
- Information Sources + Tips & Tricks
Part One: The Changing Information Landscape
The first section examined how dramatically the information environment has shifted over the past decade or so, and why that shift matters for practitioners. The session framed the change across four dimensions: the sheer volume of data and information now being generated, which exceeds what any team can process even with technology assistance; the velocity at which information now travels, consistently outpacing verification; the variety of sources that simply did not exist previously — drone feeds, ADS-B trackers, IoT building sensors, crowdsourced road data; and the growing challenge of veracity, as sophisticated fabrications, repurposed content, and inaccuracies have become endemic across information channels.
The workshop also examined what is genuinely new in the ecosystem: new types of automated and sensor data streams; a news media environment structurally reshaped by 24-hour cycles, audience fragmentation, and engagement-driven algorithms; growing prevalence of social media and open-source intelligence sources; and AI-generated content as a current operational challenge, not just a future one.
It was highlighted that most intelligence failures are caused by failures of analysis, not failures of collection. Critically, this failure is not limited to frontline analysts — it spans the entire information chain, from the model and algorithm designers shaping the data, to the analysts interpreting it, to the decision-makers acting on it.
The paradox of the modern information environment is that having more data does not automatically produce better decisions — it often makes decision-making harder as it creates more noise. The answer is not more information. It is better analysis of what is already available strategically augmented with additional data and information.
Part Two: Fundamentals of Research & Information Analysis
The second section covered analytical thinking skills that rarely appear in EM training. It opened with the statement that “The right answer to the wrong question is more dangerous than the wrong answer to the right question.” There is an asymmetry worth noting: the right answer to the wrong question tends to generate confirming feedback — decisions appear to work, stakeholders seem satisfied, and the underlying misdirection goes undetected until the consequences compound. The wrong answer to the right question, by contrast, tends to surface its own error relatively quickly, because feedback loops quickly expose the mistake. This makes question formulation not just an analytical nicety, but a form of risk management.
We also looked at three question drivers: What decision is this supporting? What degree of precision is needed? What are associated keywords (or phrases), including those others might use?
From there, the workshop addressed assumption testing and expert intuition: when it works and when it doesn’t.
We then dove into analysis pitfalls to watch out for:
- Hidden assumptions
- How source material can influence a narrative
- Anscombe’s Quartet — a classic illustration of how four datasets with identical summary statistics can look completely different when visualized, underscoring why numbers alone can mask what the data is actually showing.
- Survivorship bias — the datasets we use are often already “pre-filtered,” which can skew the results
- Data confidence
- Other pitfalls, including confirmation bias, anchoring, availability heuristics, groupthink, scope insensitivity, and the narrative trap.
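The Anscombe point lends itself to a quick demonstration. The sketch below uses only standard-library Python and the dataset values Anscombe published; it computes the mean, variance, and correlation for two of the four datasets. Although the two y-series describe very different shapes (a noisy linear trend versus a clean parabola), the summary statistics agree to two decimal places.

```python
from statistics import mean, variance
from math import sqrt

# Two of Anscombe's four datasets (Anscombe, 1973).
# The x-values are shared; the y-values differ markedly when plotted.
x = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]

def corr(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    ma, mb = mean(a), mean(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    return cov / sqrt(sum((ai - ma) ** 2 for ai in a) *
                      sum((bi - mb) ** 2 for bi in b))

def summary(y):
    """(mean, sample variance, correlation with x), rounded to 2 decimals."""
    return (round(mean(y), 2), round(variance(y), 2), round(corr(x, y), 2))

print(summary(y1))  # (7.5, 4.13, 0.82)
print(summary(y2))  # (7.5, 4.13, 0.82)
```

Identical summaries, very different data — which is exactly why visualizing before trusting the statistics matters.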
Part Three: Information Sources and Practical Research Skills
The majority of the workshop was spent in applied territory — working through the actual sources, tools, and techniques practitioners can use.
This section began with a framework for understanding source types: the distinctions between primary, secondary, and tertiary sources, and how the same source can fall into different categories depending on the research question. The rising prevalence of synthetic data (machine-generated datasets produced by AI or modeling systems to simulate real-world conditions) was also discussed.
Following a short discussion on AI, we moved into looking at simple search query operators and then moved on to doing image searches, using Google Scholar, and leveraging the Internet Archive’s Wayback Machine. Public libraries were discussed as an often-underutilized reference source providing access to a variety of online databases. We then dove into a number of different sample sources listed below and the session closed with a live, unscripted research walkthrough of questions posed by participants.
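The search operator portion can be illustrated with a small helper. The sketch below is a hypothetical convenience function (the function name is ours), but the operators themselves (quoted phrases, site:, filetype:, and minus-prefixed exclusions) are standard Google search syntax:

```python
def build_query(terms, site=None, filetype=None, exclude=()):
    """Compose a web search query string using common operators:
    quoted multi-word phrases, site:, filetype:, and -term exclusions.
    Illustrative helper; the operator syntax is Google's."""
    parts = [f'"{t}"' if " " in t else t for t in terms]
    if site:
        parts.append(f"site:{site}")
    if filetype:
        parts.append(f"filetype:{filetype}")
    parts += [f"-{t}" for t in exclude]
    return " ".join(parts)

print(build_query(["evacuation plan", "wildfire"],
                  site="ca.gov", filetype="pdf", exclude=["draft"]))
# "evacuation plan" wildfire site:ca.gov filetype:pdf -draft
```

Restricting a search to an official domain and a document format like PDF is often the fastest way to surface plans, after-action reports, and agency publications that general queries bury.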
Links Shared
The links below were shared during the workshop to illustrate the range, variety, and depth of information sources available to emergency management practitioners. They span general research tools, federal and state data sources, private sector resources, and hazard-specific databases. This list is not intended to be comprehensive — it reflects the sources used during the session to illustrate concepts and demonstrate search techniques, and should be treated as a starting point rather than a complete reference library.
Internet Archive Wayback Machine: A digital archive of the World Wide Web that allows users to view archived snapshots of websites as they appeared on specific dates in the past — particularly valuable for recovering federal data, reports, and tools that have been removed or altered on official government websites.
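Beyond the browser interface, the Wayback Machine exposes a public availability API for finding the archived snapshot closest to a given date. The sketch below builds a request URL for that endpoint (the endpoint and timestamp format are from the Archive's published API; the function name is ours):

```python
from urllib.parse import urlencode

WAYBACK_API = "https://archive.org/wayback/available"

def snapshot_url(page, timestamp=None):
    """Build a request URL for the Wayback Machine availability API.
    `timestamp` is an optional YYYYMMDD[hhmmss] string; the API returns
    the archived snapshot closest to that date."""
    params = {"url": page}
    if timestamp:
        params["timestamp"] = timestamp
    return WAYBACK_API + "?" + urlencode(params)

print(snapshot_url("fema.gov/data", timestamp="20240101"))
# https://archive.org/wayback/available?url=fema.gov%2Fdata&timestamp=20240101
```

Fetching that URL returns JSON describing the closest archived snapshot, which can then be retrieved directly — useful when a dataset has quietly disappeared from its original home.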
Google Scholar: A freely accessible search engine indexing peer-reviewed academic papers, theses, technical reports, court opinions, and books — the most accessible entry point to the academic and research literature for practitioners without institutional library access.
WorldCat: A global library catalog aggregating the collections of thousands of libraries worldwide, allowing users to search for books, reports, and other materials and identify which libraries — including local public libraries — hold them.
Securities and Exchange Commission’s EDGAR Database Search: The SEC’s Electronic Data Gathering, Analysis, and Retrieval (EDGAR) system provides free public access to corporate filings submitted to the SEC. For emergency managers, EDGAR can be a valuable and underutilized source of information on private sector entities operating critical infrastructure in their jurisdictions. Annual reports (10-K filings) contain detailed information on a company’s operations, facilities, geographic footprint, known risks, supply chain dependencies, and business continuity disclosures.
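EDGAR can also be queried programmatically. The sketch below builds the URL for the SEC's JSON submissions API, which lists a company's filing history by CIK (Central Index Key); the zero-padding convention is the SEC's, and requests to data.sec.gov should carry a descriptive User-Agent header per SEC guidance. The function name is ours.

```python
def edgar_submissions_url(cik):
    """URL for a company's filing history on the SEC's JSON submissions
    API. The CIK must be zero-padded to 10 digits in the path."""
    return f"https://data.sec.gov/submissions/CIK{int(cik):010d}.json"

# Example: Apple Inc.'s CIK is 320193.
print(edgar_submissions_url(320193))
# https://data.sec.gov/submissions/CIK0000320193.json
```

The returned JSON can then be filtered for form types of interest (for instance, 10-K annual reports) to locate the disclosures described above.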
Federal Register: The official daily journal of the federal government, publishing proposed and final rules, executive orders, agency notices, and public meeting announcements.
Private Sector Pages – Many private sector entities maintain dedicated service alert or emergency information pages that can provide valuable situational awareness during activations — particularly for supply chain, logistics, and critical commodity disruptions. A few examples shared during the workshop:
American Logistics Aid Network
Walmart Emergency Management Hub
California Energy Commission (CEC) – The CEC is California’s primary state energy policy and planning agency, responsible for forecasting energy supply and demand, advancing energy efficiency and clean energy, and maintaining data on the state’s energy infrastructure and fuel supply. The following dashboards and datasets from the CEC were highlighted during the workshop:
Refinery Inputs and Production Dashboard: Weekly summary of inputs and refined petroleum product production statewide, including crude oil, gasoline, diesel, and jet fuel. This dashboard no longer separates Northern and Southern California regions.
Refinery Stocks: Weekly summary of refined petroleum product stocks statewide, including crude oil, gasoline, diesel, and jet fuel. This dashboard no longer separates Northern and Southern California regions.
California Retail Fuel Outlet Annual Reporting (CEC-A15) Results: Data collected under the Petroleum Industry Information Reporting Act (PIIRA), which requires all retail transportation fueling stations in California to report annual retail sales of gasoline, diesel, and other transportation fuels to the CEC.
California Electricity Consumption Dashboard: Statewide electricity consumption explorable by sector, agency, and county at the monthly and annual level.
Natural Gas Consumption Dashboard: Statewide natural gas consumption explorable by sector, agency, and county at the monthly and annual level.
Open Data: A collection of geospatial data published by the California Energy Commission.
US Energy Information Administration (EIA) – The EIA is the federal government’s primary source of energy data, analysis, and forecasting, covering production, consumption, supply, prices, and infrastructure across all energy sectors. Key resources highlighted during the workshop:
EIA Homepage: Entry point for the full range of EIA data, reports, and analysis.
Short-Term Energy Outlook: Monthly report providing near-term forecasts for energy markets including oil, natural gas, electricity, and renewables — useful for anticipating supply and price trends.
Weekly Petroleum Status Report: Weekly snapshot of U.S. petroleum supply, including crude oil and refined product inventories, imports, and refinery operations.
Petroleum Supply Monthly: A more detailed monthly accounting of petroleum supply movements, production, imports, exports, and stocks by region.
PAD District Imports by Country of Origin: Tracks petroleum imports into each of the five Petroleum Administration for Defense (PAD) Districts by originating country, providing insight into regional supply chain dependencies and vulnerability to international disruptions.
FERC Staff Reports and Papers – Energy Infrastructure: Monthly updates from the Federal Energy Regulatory Commission on new energy infrastructure that has come online across the United States. Note that there is a several-month delay between the month when energy infrastructure comes online and the release of that month’s report.
International Energy Agency: An intergovernmental organization providing authoritative analysis, data, and policy recommendations on global energy. The IEA publishes flagship reports on oil, gas, electricity, and energy security that are useful for understanding the international supply dynamics that ultimately affect domestic fuel availability.
Demographics
US Census Data: The foundational source for demographic data in the United States, serving as the basis for population estimates, community profiles, and vulnerability assessments used across dozens of downstream platforms and planning tools.
California Department of Finance – Demographic Research Unit: Designated as the single official source of demographic data for California state planning and budgeting, providing population estimates and projections at the state, county, and city level.
Earthquakes
USGS Earthquake Catalogue: Searchable database of earthquake events recorded by the U.S. Geological Survey, allowing users to query by location, date range, magnitude, and depth.
USGS Earthquake Scenario Catalogue: An interactive tool for browsing modeled earthquake scenarios, viewing metadata, and accessing scenario event pages — useful for planning, exercises, and understanding potential impact footprints from credible future events in a given region.
Geotechnical Extreme Events Reconnaissance (GEER) Reports: A collection of field reconnaissance reports produced by a volunteer organization of geotechnical engineers, engineering geologists, and earth scientists from academia, industry, government, and nonprofit organizations. GEER teams respond to geotechnical extreme events worldwide, conducting detailed on-the-ground documentation to advance research and improve engineering practice. Reports are particularly valuable for understanding soil behavior, slope stability, and infrastructure performance in post-earthquake and post-landslide environments.
California Geological Survey (CGS) Library: A public reference library holding over 100,000 books, reports, maps, photographs, USGS publications, and periodicals, including the core collection of CGS publications. Open to the public for reference and research.
California Geological Survey (CGS) Maps: A curated index of the most widely used CGS map products, organized by theme — covering fault zones, landslide hazards, liquefaction susceptibility, tsunami inundation, and other geologic hazards relevant to EM planning.
Additional sites shared during the live, unscripted research walk-throughs
California Open Data: The State of California’s centralized open data platform, aggregating publicly available datasets from dozens of state agencies across topics including public health, transportation, environment, housing, economics, and emergency management — a useful first stop when looking for state-level data before drilling into individual agency sources.
California Department of Finance – Languages of the Limited English Proficient (LEP) Population: An interactive map showing the leading languages spoken by Limited English Proficient populations across California, explorable by Public Use Microdata Area (PUMA), Census Tract, and ZIP Code — directly applicable to emergency public information planning, evacuation messaging, and identifying communities that may require multilingual outreach during an activation.
California Immigrant Data Portal – Languages Spoken: A data tool from the California Immigrant Data Portal providing county-level breakdowns of languages spoken by immigrant populations — useful alongside the LEP map for understanding the linguistic diversity of a jurisdiction and informing targeted communication strategies for populations that may have limited exposure to English-language emergency alerts and official channels.
Disability Statistics – Disability Prevalence by State and County in the United States: An interactive tool drawing on American Community Survey data to provide disability prevalence estimates at the state and county level across multiple disability types — a practical resource for understanding the scale of access and functional needs populations within a jurisdiction and informing shelter planning, evacuation support, and resource allocation decisions.
