As part of the Defense Threat
Reduction Agency (DTRA) Integrated Early Warning (IEW) Advanced Technology
Demonstration (ATD), GSE helped execute the PERCEPTIVE DRAGON 3 (PD3) Technical
Demonstration at Marine Corps Base Quantico, Virginia, from September 16-19. The goal of
PERCEPTIVE DRAGON 3 was to demonstrate future concepts for chemical,
biological, radiological and nuclear (CBRN) threat assessment that support
Marine Air-Ground Task Force (MAGTF) amphibious and Air Force expeditionary
operations. The demonstration was focused on the utility and feasibility of using
novel CBRN detection and characterization capabilities, employing manned and
unmanned aerial and ground systems to conduct reconnaissance for CBRN threats,
providing awareness to tactical commanders through existing Marine and Air
Force Command and Control systems and ultimately supporting rapid
decision-making. The PD3 demonstration was the culminating integration event
for the IEW program’s technology development in FY19.
Planning for this year’s event began almost immediately after the completion of last year’s PERCEPTIVE DRAGON 2 (PD2) Technical Demonstration. The planning phase involved coordinating with our USMC and USAF stakeholders to develop goals and objectives in an iterative process with our technical lead, the US Army Combat Capabilities Development Command Chemical Biological Center at Edgewood. GSE’s role in this process was to facilitate the dialogue between our technical lead and the warfighter: helping the warfighter understand what was possible from a technical standpoint, and helping the technical team understand the requirements of the warfighters.
The PD3 technical team conducted multiple trials of each of four
scenario phases. Each trial included: (1) a pre-mission brief; (2) a run-through of
a scripted schedule of events; and, (3) a post-run hot wash session to assess
performance and gather operator feedback. There were no releases of agent simulants
during this demonstration. Instead, an emulation environment was incorporated
within the technology team’s White Cell that included virtual enemy
projectile points of impact and alerts, CBRN sensor arrays and associated data,
virtual dismounted infantry Marine movement, as well as other intelligence
feeds and data that complemented other actual data feeds from live sensors. The
combination of real and emulated data feeds provided an
efficient and realistic operational context for data and information flow into
the Marine and Air Force command and control/common operational picture
Marine and Air Force service personnel served as role players in the various command centers and C2 systems. About 25 Marine and Air Force Emergency Management/CBRN specialists participated in various roles and operated forward-positioned technologies and the C2/COP systems at various command echelons. The C2 laydown included Marine Battalion and Regimental Combat Operations Centers, Air Force Operations, and Marine Headquarters and Alpha Companies. Dismounted ECBRN reconnaissance and surveillance (R&S) teams with handheld COPs were also utilized. These participants, along with other operational stakeholders and subject matter experts who observed the technical demonstration, provided comments on the utility of the proposed solutions and made recommendations for continued technology development. Their valued input helps the ATD development team maintain IEW mission focus and make the best use of resources in the immediate future.
In Phuket, Thailand, BOHRN (Bat One Health Research
Network) convened to discuss solutions for creating improved health security,
research collaboration and advancement in the study of bats and bat-borne illnesses.
In an effort led by The Defense Threat Reduction Agency (DTRA), BOHRN has
worked to assemble the top researchers in multiple fields with the hopes of
creating a multi-disciplined baseline for research. Animal-to-human spillover is
an extremely important issue for the health and security of us all. Scientists
believe the first cases of the 2014 West Africa Ebola outbreak originated with children
playing in a bat roost in Guinea. The interaction between humans and bats
continues to be at the forefront of emerging disease research. To fully
understand the relationship of humans and bats and how our worlds collide, it
takes the combined efforts of epidemiologists, virologists, taxonomists and a
host of other experts in their fields coming together to speak a common
language in the interest of scientific advancement.
The meeting was held in Phuket, Thailand, an area well known for
its bat habitats and its history of bat research. The overall theme of
the conference was discussing the research data and determining the best way to
standardize the language between disciplines. It is the beginning of a greater
discussion into figuring out how top minds in different fields can share data
in a digestible and understandable format, one that creates effective and
immediate results. While the study of different species of bats has been around
for a while, many intergovernmental, global and scientific challenges
have prevented a free flow of knowledge across the
field. BOHRN is determined to start addressing these challenges and come up
with solutions that enhance the overall impact of One Health networks in the
interest of globalized science and research in an ever-changing and fascinating field.
Infectious diseases remain a
leading cause of human deaths world-wide, with a particularly high burden in
low-income countries. There have been considerable improvements in human health
and clinical care in recent decades. However, surging population growth,
subsistence farming and persistent overdevelopment of land in lower-income
countries have pushed people and farm animals into unprecedented contact
with wild animals and ecosystems, resulting in increased occurrences and
threats of infection with zoonotic pathogens such as avian influenza virus (AIV).
Additionally, the misuse of antibiotics and the lack of regulation pertaining
to antibiotic use in these locations’ farming and human health sectors have led
to the proliferation of antimicrobial-resistant (AMR) strains of bacteria, which
pose significant threats to global health security. These emerging infectious diseases (EIDs)
occur in “hotspots” across the globe, such as South and Southeast Asia, West
and Central Africa and the Middle East.
Past efforts to detect and reduce
the impacts of EIDs have largely been initiated by, and focused on, post-emergence
outbreak control and mitigation strategies. However, delays in detection or response
to EIDs have caused extensive mortality and high economic damages across
cultural, political and national boundaries. Since most of the developing world
relies on agricultural labor for both health and economic stability, setbacks
caused by EIDs are likely to have longer-lasting impacts there than in high-income countries.
Therefore, there is a need to develop “early warning” and predictive systems in
a local context that can quickly identify risks of disease emergence in regions
where novel diseases are most likely to emerge. Such predictive systems are
essential for focusing surveillance, prevention and control programs early in
the chain of emergence, thus containing EIDs closer to their source, and more
effectively limiting their subsequent spread and socioeconomic impacts. In economic
terms, the value of early detection and prevention of EIDs in the labor force
is higher in poorer countries. Heightened awareness of AMR prevalence is also
essential to preserve antibiotics for future generations and to understand how
to treat common diseases in AMR endemic areas.
Highly pathogenic avian influenza
virus (AIV) is a significant concern because it continues to mutate and adapt at
a rapid rate, yet the pandemic implications of these changing tropisms remain
unknown. Therefore, a coordinated, well-defined surveillance
effort should be established, networking and integrating with previously
established formal and informal global human influenza surveillance
networks. Such an AIV surveillance network could potentially be used as an “early
warning” system, a canary in the coal mine, for other EIDs in both humans and
animals, a potential that to date has not been fully realized. The Consortium
of Animal Market Networks to Assess Risk of Emerging Infectious Diseases
Through Enhanced Surveillance (CANARIES) is sponsored by the Defense Threat
Reduction Agency, Cooperative Threat Reduction, Biological Threat Reduction
Program (BTRP). It seeks to apply a multisectoral, multi-level approach,
integrating program policies, legislation and research to achieve better One
Health outcomes. Further, this network aims to complement existing early
warning activities and directly influence surveillance, prevention and control
programs that would have both health and economic impacts worldwide.
“GSE is an SBA HUBZone certified small business” are the first few words typically used to describe Global Systems Engineering (GSE). But, what does that mean?
In 1997, the Small Business Reauthorization Act ratified the HUBZone program into law which was then implemented by the Small Business Administration (SBA). The SBA continues to regulate many contracting assistance programs to help small businesses win federal contracts, including the HUBZone program. A Historically Underutilized Business Zone (HUBZone) is a distressed community that typically has a low median household income and/or a high unemployment rate.
Currently, HUBZone small businesses are required to locate their principal office in a HUBZone, qualify as a small business, and be at least 51% owned by U.S. citizens.
By locating their headquarters in a HUBZone, businesses are creating a more sustainable community by reinvesting in their neighborhoods, in turn, spurring economic growth. Syntelligent Analytic Solutions is a HUBZone company from Page County, Virginia that was certified in 2012 with 13 employees. As of 2017, the company had employed 70 individuals, created 27 new jobs in the HUBZone, and spent close to $500,000 in their community annually. That’s just one example of a successful HUBZone company and the positive impact that it has on the local community. Although change doesn’t happen overnight, HUBZone companies are contributing to improving their neighborhood’s economy in the long run.
Another requirement for the HUBZone program is to have at least 35% of employees live in a HUBZone. This creates jobs and increases employment in areas that may have a high unemployment rate. GSE makes the most of this opportunity by reaching out to college students who live in HUBZones and offering them paid internships that can be worked remotely. This provides jobs for college students in need of employment as well as exposure to a positive work environment, which opens many doors post-graduation. By participating in this program, GSE is not only supporting the Alexandria area that is local to headquarters but is also supporting various HUBZones in numerous states.
GSE values giving back to the community. Our status as a HUBZone company is just one way that GSE showcases our dedication to bettering the areas we live in. We are currently exploring different ways to further our efforts, including implementing a corporate social responsibility program.
Our status also provides us with unique access to federal contracting mechanisms and preferential treatment allotted specifically for our socio-economic class.
The federal government provides preferential treatment to small businesses to help level the playing field between small and large businesses. One way they achieve this is by awarding various percentages of prime contracting dollars to different categories of small businesses. The government set a goal to award 3% of all federal prime contracting dollars to HUBZone certified small businesses. Another way the government provides assistance is by limiting competition using set-aside contracts that are specifically for small businesses. In contract competitions that are full and open, HUBZone businesses are given a 10% price evaluation preference.
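To make the 10% price evaluation preference concrete, here is a rough sketch of how it can play out in a full-and-open competition. The company names and dollar figures are invented for illustration, and actual evaluation rules depend on the specific solicitation:

```python
# Hypothetical sketch of the 10% HUBZone price evaluation preference:
# in a full-and-open competition, a non-HUBZone offer is evaluated as
# if its price were 10% higher. All names and figures are invented.

def evaluated_price(price: float, is_hubzone: bool, preference: float = 0.10) -> float:
    """Price used for evaluation purposes only (not the price actually paid)."""
    return price if is_hubzone else price * (1 + preference)

offers = [
    {"name": "HUBZone Co.", "price": 1_050_000, "hubzone": True},
    {"name": "Large Co.", "price": 1_000_000, "hubzone": False},
]

for offer in offers:
    offer["evaluated"] = evaluated_price(offer["price"], offer["hubzone"])

# Large Co.'s $1.00M bid is evaluated at $1.10M, so the HUBZone offer
# wins on evaluated price despite its higher actual price of $1.05M.
winner = min(offers, key=lambda o: o["evaluated"])
print(winner["name"])  # -> HUBZone Co.
```

In this sketch the HUBZone firm can bid up to 10% above a competitor and still prevail on evaluated price, which is the mechanism that helps level the playing field.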
In order to maintain HUBZone status, businesses are required to re-certify for the program (currently every 3 years). A business could re-certify for 100+ years and still maintain status because there is no limit as long as the business continues to qualify. However, the SBA must be notified if there is a change in ownership, business structure, principal office, or if the 35% employee residency requirement is not met.
Global Systems Engineering is proud to be an SBA-certified HUBZone small business. Hopefully, with this article clarifying what that means, the words used to describe GSE now hold significant meaning and provide some insight into what the company culture of GSE is like.
Dr. Jennifer “Jenny” Stevenson recently celebrated her one-year anniversary with GSE, a year that has been filled with outstanding work and great recognition. Jenny’s educational background is in Library and Information Science with an Archival Concentration, Digital Libraries, and Information Studies. She has a passion for “organizing information and making it discoverable” to a diverse range of users.
Jenny’s current work at the Defense Threat Reduction Information Analysis Center (DTRIAC) has received high praise. She has given multiple presentations to different user groups on her work with the Department of Defense’s (DoD) nuclear testing archives. The archives contain millions of items across many media types, including documents, photos, film, and test drawings, with information dating from the Manhattan Project (during WWII) through the present day. The leading research question is: “How do you get to the science quickly while creating a space that allows one to evaluate recorded materials and permits accessibility?” Jenny created the Advanced Search and Discovery (ASD) program for the DoD nuclear testing archives. The ASD program is taking the initial step toward making the archives easily accessible and available to researchers through machine learning and automated arrangement and description. The machine learning will be supervised and serves as a tool to accelerate the work rate and reduce backlog. For an undertaking this massive, the traditional archiving process would be too slow and would create a backlog; the ASD program instead combines Jenny’s years of archival experience with machine learning to match the scope of this collection. The result is a living, breathing catalog that organizes information and makes it readily available to researchers. The machine learning at the core of her Advanced Search and Discovery program is expected to cut more than two decades from the estimated 32 years that traditional cataloging methodology would require.
For her significant contribution, she received an award from SAIC!
The thought of joining the adult workforce is an exciting, yet scary thought for many young adults; to go from knowing everything to realizing how little you actually know. Fortunately for the Junior Research Analysts (JRAs), Global Systems Engineering (GSE) provides real world experience that prepares us for life after college.
GSE recognizes the importance of the Science, Technology, Engineering and Mathematics (STEM) field in the defense of our nation and strives to support the education of young adults in this field. Providing experiences like these to young adults educates and inspires the next generation of great minds that will help defend our nation in the future.
One opportunity presented to the JRAs was attending the Technical Support and Operational Analysis (TSOA) 19-3. This event is held multiple times throughout the year to assess and showcase new technological products for defense in real-world scenarios. The one we attended was five days long, with about 100 people in attendance. A group of six JRAs accompanied four of GSE’s full-time employees on Monday and Thursday; two JRAs attended Tuesday as well.
We were the only people in the room attending primarily to observe, which allowed us to really understand the event and how it works. Even so, it was fairly intimidating.
McKenna Disbrow, a JRA, expressed feeling excited. “I thought the event would be more informational and involve just listening to people talk, but it was a real experience watching the vendors interact with the warfighters and each other to most effectively assess the products.”
The event started with a safety briefing and a general overview of what was going to happen and who everyone was. Each vendor had the opportunity to give a two-minute brief about their product, which was beneficial because it allowed us to understand the uses of each product and how it contributed to the situation being assessed. The 27 products presented at this TSOA varied widely in their uses. It was incredible to see the results of new technology development and how it can benefit defense.
After everyone was settled and the training began, the JRAs were given a list of potential vendors that GSE would like to see attend the Chemical and Biological Operational Analysis (CBOA) in 2020. The CBOA is an event similar to the TSOA, which GSE also attends, but it is specifically for chemical and biological defense products. When looking at vendors and products for these events, one must look to the future and anticipate which products will best help us in 2028.
The group was divided up according to interest to ensure everyone had the opportunity to network and interact with people who are involved in their field. Networking is a very important aspect when it comes to a professional career. The JRAs were able to practice networking with these working professionals. This allowed them to create a connection that could be useful later in their careers while learning something new.
Nick Francis, another JRA, expressed, “I wish we knew more people here, but making the connections today will help us know people for next time.”
Elizabeth Halford, Senior Program Manager at GSE, thought it would add to the experience for us to have a “Meal Ready to Eat” (MRE) for lunch. Warfighters eat MREs out in the field to get a quick, easy, high-calorie meal to replenish their energy. A chemical reaction takes place that heats up the food so you can eat a hot meal without a kitchen.
Like warfighters do, each of us randomly reached in and picked one out. My first meal was macaroni in tomato sauce. It was not bad and smelled rather appetizing. JRA Pierce Jennings was not so lucky and had a chicken dish that smelled like tuna. There was an abundance of food and snacks available in each bag, so we were able to trade around with each other.
On Thursday, we had the unique opportunity to go down range and watch the “situation” play out in person. There was a mock city where situations were set up to most effectively assess the products in a relevant, realistic environment. The warfighters and the “bad guys” took the situation extremely seriously so they could provide accurate feedback about the effectiveness of the products.
Attending this event was an extraordinary experience that was not only fun, but highly educational both academically and socially. The other JRAs and I are incredibly grateful to GSE for this opportunity.
In my career, I have had the pleasure of supporting numerous U.S. Government disaster preparedness, response, and recovery efforts. For the most part, I have observed that these three types of efforts are categorized as separate yet chronological missions – without much overlap in terms of how they are managed or resourced.
You prepare for a disaster. When it occurs, you respond to it. And then after everything settles down, you begin to perform longer-term recovery efforts. In other words, all three mission phases are conducted as standalone efforts. However, I have found that this is not necessarily the most effective approach toward successfully managing a disaster.
Each disaster – whether it be a hypothetical one or a real one – is as distinctive and varied as the individual personalities of the federal, state, and local representatives responsible for working these three aspects of an incident. Although the incidents and individuals themselves are always unique, there seems to be a theme that transcends these differences: the interconnectedness between preparedness, response, and long-term recovery efforts, and a reasonable necessity for blurred boundaries between these mission areas.
I have observed that effectively managing a disaster depends on establishing a community with a culture of cooperation, regardless of where one mission starts and another mission begins. At an individual level, this often means that, during these three phases of an incident, a proactively collaborative philosophy is put ahead of individual personalities and even mission “swim lanes.”
This collaborative approach requires that those focused on disaster preparedness communicate with and support the response mission, those working on response efforts collaborate and talk to those who will work toward long-term recovery, and both response and recovery representatives help inform future preparedness efforts. The most successful disaster management efforts are not seen exclusively in a preparedness phase, in an effective immediate response mission, or in a well-implemented recovery process – the most effective approach comprises a blended activation of all three phases.
This lesson became clear to me during the 2011 Japanese tsunami and the associated Fukushima nuclear plant disaster. When I was working out of the U.S. Embassy in Tokyo as part of the U.S. response mission following the incident, the Deputy Chief of Mission (DCM) clearly understood that the circumstances demanded a holistic and comprehensive level of support for our foreign partners. Each morning, he held an all hands on deck coordination meeting with a diverse audience of embassy officials and U.S. interagency representatives in the embassy’s first floor auditorium. He started each meeting by asking those with a preparedness mission, those with a response mission, and those with a long-term recovery mission what they were doing to support each other.
A long time has passed since Fukushima, but I still think about the Embassy DCM’s approach toward disaster management. These days, I spend a lot of my time dealing with the daily disasters that occur on a much smaller scale in my own house, and it has taken everything I have learned about disaster response to navigate the risks associated with a life shared with both a one-year-old and a three-year-old.
Now that my one-year-old daughter has developed the ability to swiftly stagger around the house on her own, she has adopted parading over to the dog’s water dish and sadistically flipping it over as her new favorite pastime. After this occurred a few times in a row, my wife started preparing for future incidents by laying large towels on the floor. (Why we didn’t just move the water dish somewhere else is a conversation for another time.) My three-year-old son, who is currently going through a well-intentioned “I must help” phase, typically responds by running over to the incident and frantically throwing paper towels, napkins, clean diapers, or his socks on the spill. Then I come in, finish where my son left off, and restore the floor to its original non-dog-water-flooded condition.
Although not a true disaster, the dog dish being flipped over onto the floor is a type of incident that must be taken seriously. Thus, I realized that the threat of future dog-dish upheaval was one that our household would benefit from learning to cooperatively respond to. I called a family meeting to outline some of the lessons I’ve learned throughout my career about how the three of us – or four if you count the dog – can work more collaboratively together in navigating how we prepare for, respond to, and recover from the inevitable situation. An all hands on deck approach in our family response would make everyone’s job easier, and begin to teach my son the principles of teamwork and effective response from a young age.
I started telling my wife and son about how, back in 2011 in Tokyo, the DCM would specifically ask USAID representatives what they needed in the response phase to be successful in their recovery mission. I described to my family how he would then turn to those supporting the immediate response and ask what preparedness activities and points of contact were needed to help support them in their immediate near-term mission, and how he would ask those responsible for preparedness what lessons could be learned from this crisis to better help the U.S. and its partners prepare for future incidents.
I explained to my wife and oldest child that, in any form of crisis management, the missions of preparedness, response, and recovery are really part of one larger mission and must strategically overlap to maximize their individual mission efforts.
After I used the Fukushima story to illustrate how everyone in the family can work with each other in the face of this new water dish threat, my wife handed me my daughter and a clean diaper and asked me to respectfully respond to her newest disaster, restore her butt to its previous clean form, and prepare for follow on incidents as she had eaten mashed-up prunes and some playdough for breakfast. I think that was my wife’s way of telling me she appreciates me drawing parallels between a nuclear disaster and my daughter’s habits of making a mess. Meanwhile, my son had stopped listening about halfway through my pontification and began to smash goldfish crackers into the living room carpet, probably as a way to give the family more practice working as a team in managing disasters.
At one point back in 2011, I got a chance to talk one-on-one with the DCM. He took the opportunity to complain about how his March Madness bracket had undergone its own tsunami thanks to an unnamed ACC team out of Tallahassee. However, I used the opportunity to ask him about his approach with disaster management. Although he did not have a lot of disaster management experience, he felt that – with as much capability available to support our allies and as terrible as the disaster was – he simply didn’t want to see the three individual efforts of response, recovery, and future preparedness efforts start and stop independently without allowing for a meaningful transition to take place.
He said that they needed each other to be the best at their respective roles. It was a simple and intuitive answer. I doubt he realized that the experience had helped shape the way I think about disaster management. However, still to this day, I believe that holding these relatively intuitive principles ahead of mission introversion, and of the default human disposition to prioritize one’s own objectives ahead of another’s, yields an opportunity for larger success.
In March, the residents of Cape Town, South Africa were delivered a reprieve from doomsday when city officials announced that “Day Zero,” when the city will be forced to cut off the taps to city residents, had been pushed back and was no longer predicted to occur in 2018.[i] Extreme water conservation measures, including rationing of 50 liters per resident per day—roughly a sixth of the average American’s daily usage—have slowed sinking reservoir levels, but Cape Town’s future is still dependent on getting decent rainfall this winter. Ironically, counting on the rain was one of the primary causes for the current crisis, as city officials in recent years banked on average historical rainfall patterns holding despite warnings of depleted reservoirs and an increasingly unpredictable climate.
Having grown up in southern California, I’m no stranger to drought and alarming predictions of water scarcity, albeit in less dire circumstances. The state’s water supply has been under constant strain for my entire life. Yet I never saw any impact on southern Californians’ lifestyle until I went home to stay with my parents in Orange County in the summer of 2015, a few months into state-mandated water restrictions. The conservation measures were noticeable, but nowhere near as painful as Cape Town’s. Before the situation got desperate, California was bailed out by an unusually wet winter in 2016-2017 that refilled reservoirs across the state and led to a record snowpack in the Sierra Nevada, the main water source for much of the state.
In hindsight, California got lucky. Cape Town’s ongoing battle to forestall “Day Zero” illustrates the outcome California could have gotten—and could still get. The state’s Water Resources Control Board is considering re-imposing restrictions amid signs that California could be slipping back into drought conditions.[ii] As in South Africa, American water management officials can’t bank on past precipitation patterns holding and have to prepare for an increasingly volatile water supply exacerbated by climate change.
Cape Town is currently scrambling to build four desalination plants and a wastewater processing plant as well as drill new water wells as part of a last-ditch effort, but most of those projects are behind schedule and won’t be operational in time to impact the current crisis. One question raised by California and Cape Town is: why is it that our water management strategy in drought-prone regions is so often reactive rather than proactive?
Water supply management in the western United States admittedly presents an intricate dilemma, lying at the intersection of decades of shortsighted public policy, an increasingly unpredictable climate, unfavorable economic incentive structures, and myriad engineering challenges distributed across interstate, state, regional, and municipal levels. It’s a true system of systems problem.
The water supply problem presents a full slate of challenges and there’s no silver bullet to address them all. But there is clearly a need for a diversified water supply portfolio that includes drought-resistant sources that can provide drinking water on a large scale. Desalination—the process of removing salt and minerals from seawater or brackish water—could be among the potential solutions.
Though it is often decried for its significant power consumption and detrimental environmental impacts, desalination may present one of many potential paths forward for arid, climate-challenged states like California or Texas. Desalination has faced stiff resistance to adoption in the United States, which primarily hinges on the higher price consumers have to pay for drinkable water supplied through desalination compared to other fresh water sources like rivers or groundwater.
Desalination at utility-level production is a highly energy-intensive process
using current methods. Seawater reverse osmosis, the most commonly used
technique in the United States, requires pumping water through several stages
of pretreatment and then forcing it through semipermeable membranes at high
pressure to strain out salt and particulates. These energy requirements add up
to half or more of the total cost of desalination in most plants. Some
consumers are also concerned about the environmental impacts of desalination,
like the effect of discharging the highly saline brine byproduct on ocean life
and the greenhouse gas emissions from all the required power production.
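A hedged back-of-envelope sketch shows how energy can come to dominate the cost. The specific energy, electricity price, and all-in water cost below are assumed illustrative values, not data from any particular plant; seawater reverse osmosis facilities are commonly reported to use on the order of 3-4 kWh of electricity per cubic meter of product water:

```python
# Back-of-envelope estimate of energy's share of desalinated water cost.
# All three inputs are assumed illustrative values, not plant data.

specific_energy_kwh_per_m3 = 3.5   # assumed electricity use per m3 of product water
electricity_price_per_kwh = 0.10   # assumed grid price, $/kWh
total_water_cost_per_m3 = 0.70     # assumed all-in production cost, $/m3

energy_cost_per_m3 = specific_energy_kwh_per_m3 * electricity_price_per_kwh
energy_share = energy_cost_per_m3 / total_water_cost_per_m3

print(f"Energy: ${energy_cost_per_m3:.2f}/m3, "
      f"{energy_share:.0%} of total cost")  # -> Energy: $0.35/m3, 50% of total cost
```

Under these assumptions, electricity alone accounts for roughly half the production cost, which is why efficiency gains and cheap power are the main levers for making desalinated water competitive.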
Obviously, there’s a wide array of barriers to desalination adoption that must be overcome. Yet recent advances in desalination technology show promise for mitigating these concerns while bringing costs down.
As the National Research Council noted in a seminal 2008 report on desalination that holds true to this day:
“Water scarcity in some regions of the United States will certainly intensify over the coming decades, and no one option or set of options is likely to be sufficient to manage this intensifying scarcity. Desalination, using both brackish and seawater sources, is likely to have a niche in the future water management portfolio for the United States.”[iii]
And a niche it should have. Desalination plants could serve as a hedge against a Cape Town-like crisis, staving off a potential Day Zero in a worst-case situation, if American consumers can be convinced that paying a slight premium for drought-resistant water sources to diversify their water supply portfolio is worth it. Getting there will likely require closer cost parity between desalinated water and other water sources, as well as minimizing desalination's environmental impact to assuage public resistance and allow the technology to be accepted as one piece of a larger climate change adaptation strategy.
In a bid to increase efficiency and reduce costs, the desalination industry has already shifted toward collocating desalination plants with coastal power plants. The benefits are threefold: the seawater already drawn in as cooling water for the power plant leaves warmer than ambient, which modestly reduces the energy needed to push it through reverse osmosis membranes; construction costs can decrease 5-20% as the two plants share seawater intake and discharge facilities; and less electricity is lost in transmission because the distance between the plants is reduced.
The recently opened Carlsbad Desalination Plant in San Diego County, a prime example of a collocated desal plant and now the largest in the nation at 50 million gallons (190,000 cubic meters) of drinkable water per day, is considered state of the art by American desalination standards. Yet it also illustrates the enduring challenges to desalination's acceptance and competitiveness in the United States. The plant is relatively efficient in its energy usage and produces water at fairly competitive prices, but it still draws power from the local grid, roughly 70% of which comes from nonrenewable sources. On top of that, it was built on the site of the 1950s-era, natural gas-burning Encina Power Station, which the county has been trying to shut down. The poor optics helped fuel a slew of criticism and lawsuits against the plant, even though Poseidon Water, the operator, purchased carbon emissions offsets and undertook reforestation programs. Lingering bitterness over Carlsbad has held up development of other desal plants in Southern California.
The American desalination industry would do well to learn from the examples of oil-rich Saudi Arabia and the United Arab Emirates, which ironically are both global leaders in renewable energy-powered desalination. Saudi Arabia’s Al Khafji plant became the world’s first operational, large-scale, solar-powered desalination plant when it came online in November 2017. It features a 15-megawatt array of polycrystalline solar cells, which produces enough electricity to support the plant’s desalination of 60,000 cubic meters of water per day, although not enough to also serve as a power source for the surrounding community. But the whole project is valued at $130 million, well under the Carlsbad plant’s $1 billion total construction price tag. Saudi Arabia got a third of Carlsbad's production capacity at roughly one-eighth of the capital cost, while shedding the stigma of burning fossil fuels.
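Using the figures cited above, the capacity-versus-cost comparison can be made concrete by normalizing each plant's capital cost to a unit of daily production capacity (the capex and capacity numbers are those quoted in this post, rounded; actual project accounting differs):

```python
# Capital cost per unit of daily production capacity, from the figures above.
plants = {
    "Carlsbad (USA)":  {"capex_usd": 1.0e9, "capacity_m3_day": 190_000},
    "Al Khafji (KSA)": {"capex_usd": 1.3e8, "capacity_m3_day": 60_000},
}

for name, p in plants.items():
    cost_per_m3_day = p["capex_usd"] / p["capacity_m3_day"]
    print(f"{name}: ${cost_per_m3_day:,.0f} of capex per m3/day of capacity")
```

By this rough measure Al Khafji cost on the order of $2,200 of capex per cubic meter per day of capacity, versus roughly $5,300 for Carlsbad, though the two projects differ in scope (Carlsbad's price tag includes conveyance infrastructure), so the comparison is indicative rather than exact.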
Though critics have charged that the operating costs of solar-powered desal are prohibitively expensive, an ambitious effort underway in Abu Dhabi suggests that solar-powered desalination is becoming increasingly competitive. The Masdar project, bankrolled largely by the UAE’s sovereign wealth fund, aims to create an entire carbon net-zero, self-sufficient city for 50,000 residents. Though the overall effort has run into setbacks, including the revelation that being completely net-zero isn’t feasible, a solar-powered desalination pilot program under the effort has shown strong potential for scaling up to help supply the city.[iv] Capitalizing on cheaper and more efficient solar cells could cut the price of water from solar-powered desalination in half by 2050.[v]
In the United States, such high-risk ventures are less practical without government support. But the federal funding picture changed drastically in FY 2018. The Department of Energy issued a funding opportunity announcement in September 2017 and expects to start making grants totaling $15 million this fiscal year to support solar-powered desalination research, especially research into integrating solar desalination systems.[vi] That is a relatively massive shift by U.S. federal funding standards: the Department of the Interior’s Bureau of Reclamation, historically the largest federal funder of desalination research, expects to grant just $1.2 million under its desalination research program for FY18.[vii] Desalination companies and researchers should seek to leverage this bow wave to reinvigorate renewable energy desalination research in the U.S. and propagate this interest in “clean” desalination. Other barriers remain to making solar-powered reverse osmosis desalination competitive at large scale, such as how to store solar energy efficiently so a plant can keep running overnight, but those will likely prove surmountable in the face of continued technological advances.
If American water management officials take one lesson from the Cape Town water crisis, it should be that we cannot rely solely on the methods that helped us cope with past droughts. Adapting to future droughts driven by an increasingly erratic climate will require bold, innovative strategic planning that can attract federal research support and overcome public resistance. Integrating renewable energy production with desalination represents exactly that kind of thinking.
[i] David McKenzie and Brent Swails, “Day Zero deferred, but Cape Town’s water crisis is far from over”, CNN, March 9, 2018, https://www.cnn.com/2018/03/09/africa/cape-town-day-zero-crisis-intl/index.html
[ii] Associated Press, “Could California drought restrictions slash water rights? Some think so,” CBS News, Feb. 21, 2018, https://www.cbsnews.com/news/could-california-drought-restrictions-slash-water-rights-some-think-so/
[iii] National Research Council, 2008, Desalination: A National Perspective, Washington, DC: The National Academies Press, https://doi.org/10.17226/12184
[iv] “Renewable Energy Water Desalination Programme,” Masdar, 2018, http://www.masdar.ae/assets/downloads/content/3588/desalination_report-2018.pdf
[v] Richard Martin, “To make fresh water without warming the planet, countries eye solar power,” MIT Technology Review, May 12, 2016, https://www.technologyreview.com/s/601419/to-make-fresh-water-without-warming-the-planet-countries-eye-solar-power/
[vi] “Funding Opportunity Announcement: Solar Desalination,” Department of Energy, Solar Technologies Office, DE-FOA-0001778, Sept. 27, 2017, https://www.energy.gov/eere/solar/funding-opportunity-announcement-solar-desalination
[vii] “Desalination and Water Purification Program for Fiscal Year 2018,” U.S. Department of the Interior, Bureau of Reclamation Research and Development Office, Funding Opportunity Announcement No. BOR-DO-18-F002, Feb. 2018
In the wake of devastating events, such as the recent impact of Hurricane Harvey in Texas, I am reminded of the importance of a comprehensive national strategy for emergency preparedness and response. It is difficult for anyone to remain focused on the broader collaborative picture during any emergent situation, let alone a hurricane that resulted in days of flooding, destruction, and loss of life. However, we have seen time and again that key sectors within government, industry, and relevant non-governmental organizations provide the most value when responding to an incident in a collaborative and coordinated manner. For example, soon after the Arkema chemical plant explosion was reported in the aftermath of Hurricane Harvey, the Environmental Protection Agency (EPA) deployed one of its resources, an Airborne Spectral Photometric Environmental Collection Technology (ASPECT) aircraft, to Crosby, Texas to collect chemical information from the resulting smoke cloud and help inform the response efforts. In doing so, the EPA not only fulfilled its sector- and agency-specific requirements, but also enhanced the overall strategy and resources of the state and local first responders leading the response effort.
Events like these highlight that there must be a mutual understanding of the goals, requirements, and expertise of sector counterparts in order to work together effectively. This is a crucial baseline for cohesive emergency preparedness and response planning. In the United States, we refer to this as Whole-of-Government/Whole-of-Community. With our mission and country partners in Southeast Asia, we refer to it as multi-sectoral. Regardless of the terminology, this approach applies to all forms of emergency incidents, whether natural or man-made in origin, accidental or intentional, and across all hazard types.
Lessons Learned: United States
Over the years, the United States has accrued experience and lessons learned in multi-sectoral coordination from numerous events, such as the response to the 9/11 Attacks in 2001 and Hurricane Katrina in 2005. These events of national concern highlight that coordination and collaboration among various sectors and entities, within and outside of government, is a crucial component of effective threat and hazard mitigation.
For strategy, the United States relies upon the National Response Framework (NRF), which is a document that provides guidance for all-hazard incident response in the United States. Preceded by the National Response Plan (NRP), the NRF was developed in part from lessons learned during Hurricane Katrina in 2005, the September 11, 2001 terrorist attacks, the London bombings, and national, state and local exercises.[i] It emphasizes multi-sectoral roles and responsibilities during an incident response. The NRF delineates an approach to government and private sector integration for emergency preparedness, response and recovery efforts, and it advances the notion that governments at all levels, the private sector, non-governmental organizations, and individual citizens share responsibility in incident response.[ii] The NRF relies upon the National Incident Management System (NIMS) to coordinate all phases of emergency management activities among all levels of government, the private sector and non-governmental organizations.[iii]
An example of U.S. multi-sectoral coordination in practice can be found in the U.S. Federal Bureau of Investigation (FBI) and Centers for Disease Control and Prevention (CDC) Joint Criminal and Epidemiological Investigations (CrimEpi) Program. This program was developed to enable communication, collaboration, and coordination between the law enforcement and public health communities during a response to a potential biological threat.
Lessons Learned: Malaysia
In Malaysia, national concern regarding multi-sectoral coordination during an emergency response was likewise prompted by experience and lessons learned. Various disasters, including mudslides, landslides, Tropical Storm Greg in 1996, and the collapse of the Highland Towers Condominium in 1993, drove the creation of an integrated all-hazard disaster management system in Malaysia.
Malaysia’s National Security Council (NSC) Directive No. 20 is the mechanism for integrated management of major land-based all-hazard disasters and incidents. This policy determines the roles and responsibilities of the various stakeholder agencies involved in a disaster response. Through NSC Directive No. 20, a Disaster Management and Relief Committee (DMRC) is established at the national, state, and district levels. Each DMRC consists of an interagency, inter-sectoral group of stakeholders. The federal-level DMRC formulates policies and strategies, while the state- and district-level DMRCs implement the national disaster management procedures. NSC Directive No. 20 also requires all government agencies to develop Standard Operating Procedures (SOPs) for disaster prevention, including a review and update of their Emergency Response Plans (ERPs).[iv]
An example of a Malaysian multi-sectoral coordination activity is the Biological Incident Response Training and Evaluation (BRITE) program, sponsored by the U.S. Defense Threat Reduction Agency (DTRA) Cooperative Biological Engagement Program (CBEP). The BRITE program was modeled on the aforementioned U.S. CrimEpi program and tailored to improve the ability of Malaysia’s law enforcement, public health, animal health, and community stakeholders to respond to biological incidents. Global Systems Engineering assisted in the implementation of the BRITE program in Malaysia.
A Global Issue
The importance of multi-sectoral coordination in emergency preparedness and response is not specific to one country; collaborative national emergency management strategy is an issue of global concern. Countries face different hazard and threat variables due to their particular societal, economic, and geographic circumstances. While coordination efforts must be tailored to the characteristics of a particular country, the need for a multi-sectoral approach and strategy for all-hazards response efforts holds across international boundaries. For example, the Indian Ocean earthquake and tsunami in 2004, and various international Highly Pathogenic Avian Influenza (H5N1) outbreaks over the years, have emphasized the need for strong multi-sectoral coordination in emergency response efforts.
Unanticipated hazards will arise in the future. As of this post, just as the Hurricane Harvey response moves into the recovery phase, another hurricane is on the not-so-distant horizon: Hurricane Irma has already strengthened into a Category 5 storm and is heading toward Florida at a steady pace. We continue to see that hazards and threats are almost always unpredictable in timing, scope, and severity. This only underscores how imperative it is that national multi-sectoral coordination strategies for emergency response remain flexible, inclusive, and adaptable to emerging, unprecedented threats.
The sweeping craze of wearable fitness trackers has enabled increased awareness of health status for everyday users. Combining step trackers with email, calendars, text alerts, and sleep monitoring has made wearable tech an integral part of a daily routine. Add in some friendly competition as people challenge their friends to see who can get the most daily steps, and these devices provide a fun incentive to keep them on at all times. But recently, wearable tech companies have invested in providing more meaningful data than step counts and energy expenditure. Sleep quality measurements and heart rate tracking have opened the door to potentially more significant health monitoring. As wearable tech continues to develop, a health-conscious, competitive game with friends looks like just the beginning.
Warfighter health readiness is constantly at risk from biological (naturally occurring or intentionally released) and chemical exposure. The U.S. Armed Forces have a long history of encountering infectious diseases in the field; in past conflicts, infectious diseases have caused greater mortality than battle injuries. Melioidosis, nicknamed the “Vietnamese time bomb,” is a notable example of warfighters unknowingly encountering bacteria as helicopters kicked up dirt and hidden pathogens in the tropical soil. Melioidosis is often difficult to diagnose and can remain latent for years before symptoms actively present. U.S. military personnel were also on the ground providing logistical support and training during the Ebola outbreak in West Africa, and outbreaks of Q fever and leishmaniasis in soldiers returning from Iraq and Afghanistan show that biological exposure is not a stand-alone risk but a ubiquitous threat regardless of time and place. Identifying the presence of disease before a soldier knows they have been exposed provides the opportunity to remove the soldier from immediate threats, treat them quickly, and keep them in condition for action.
If wearable sensors now under development could alert soldiers and leaders to impending illness before a soldier starts to feel the symptoms of a disease, this information could be integrated with other information sources as part of the Integrated Early Warning (IEW) effort under development by the Defense Threat Reduction Agency (DTRA). Dr. Christian Whitchurch is providing the vision and leading the effort for Wearable Sensing for Chemical and Biological Exposure Monitoring.
One key element of enabling IEW wearable technology is effectively managing data from devices to a platform that can analyze data and create useful data visualizations. As Mike Midgley mentioned in a previous blog post, “the challenge in developing this integrated architecture is not only collecting all of this information in real time from a network of sensors and other data sources, but also enabling the commanders to get the ‘so-what’ to make informed decisions and not become paralyzed by an excess of data”. On the individual level, preemptively identifying illness or heat strain could keep the warfighter healthy and fit for continued work in the field. On an aggregate level, trends in data could show when units are collectively exposed to a threat, or have reached an unsafe level of heat strain. This information would only be useful, however, when packaged into a digestible form.
Equivital Ltd, of Cambridge, UK, is one wearable tech company that has developed a display using green, amber, and red indicators to convey the risk associated with each soldier’s physiological status. Some measures, such as body temperature and heart rate, may not be very useful on their own as warning indicators, but they can be combined by an algorithm developed by the US Army Research Institute of Environmental Medicine (USARIEM) into a Heat Strain Index (HSI) to alert leaders to potential heat injury. Leveraging data visualization like that of the Equivital™ Black Ghost System (shown below) to display warnings of potential illness is the next step.
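To make the traffic-light concept concrete, here is a minimal sketch of how two raw measures might be fused into a single index and mapped to green/amber/red. This is emphatically not the USARIEM algorithm; the weighting, scaling, and thresholds below are invented purely to illustrate the pattern of combining physiological signals into one decision cue:

```python
# Hypothetical sketch of a traffic-light heat strain indicator.
# NOT the USARIEM HSI algorithm: weights and thresholds here are invented
# solely to illustrate fusing physiological measures into a single risk cue.

def heat_strain_index(core_temp_c: float, heart_rate_bpm: float) -> float:
    """Combine core temperature and heart rate into a 0-10 index (illustrative)."""
    temp_component = max(0.0, core_temp_c - 37.0) / 2.5      # ~39.5 C maps to 1.0
    hr_component = max(0.0, heart_rate_bpm - 60.0) / 120.0   # ~180 bpm maps to 1.0
    return 10.0 * min(1.0, 0.6 * temp_component + 0.4 * hr_component)

def risk_color(hsi: float) -> str:
    """Map the index to the green/amber/red display convention."""
    if hsi < 4.0:
        return "green"
    if hsi < 7.0:
        return "amber"
    return "red"

for temp, hr in [(37.0, 70), (38.5, 150), (39.8, 185)]:
    hsi = heat_strain_index(temp, hr)
    print(f"core {temp} C, HR {hr} bpm -> HSI {hsi:.1f} ({risk_color(hsi)})")
```

The value of this pattern for a commander's display is that the fusion and thresholding happen once, in a validated algorithm, so the operator sees only the digestible color cue rather than raw sensor streams.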
Integrated early warning using wearable technology would enable timely countermeasures, triage, and exposure notification to not only protect the lives of our warfighters, but to enable mission assurance and provide information for effective decision making. Integrating wearable tech with personalized medicine and military preparedness would best enable our military leadership to make informed and proactive decisions for the benefit of the warfighter. We are only beginning to scratch the surface for the capabilities of this developing wearable technology. Continued multisector coordination between the warfighter, military leadership, and researchers will pave the way for wearable technology to function in a way that best serves all stakeholders. GSE looks forward to supporting this effort as DTRA leads the way for Wearable Sensing for Integrated Early Warning.