Title: The need for speed – rapid evolution of microbiological testing in drinking water
Authors: P. Whalen, Dan Kroll, P. Gallant
DOI: 10.2166/9781780408699_0015
Published: 15 October 2018, in Microbiological Sensors for the Drinking Water Industry, Torben Lund Skovhus and Bo Højris (Eds.)
Licence: CC BY 4.0 – users may reuse the Work, including commercially, provided appropriate credit is given (with a link to the formal publication through the relevant DOI), a link to the licence is provided, and the licensor is not represented as endorsing the use made of the Work. Full details at http://creativecommons.org/licenses/by/4.0/.
Source PDF: https://iwaponline.com/ebooks/chapter-pdf/494570/9781780408699_0015.pdf (downloaded by guest on 29 July 2019)

Abstract

… was born. Into the 20th century, the widespread adoption of microbiological testing, coupled with water filtration and later chlorination, resulted in the most significant decline in mortality rates in modern history (Cutler & Miller, 2005). Despite early rapid advances, microbiological testing methods such as plate-growth methods remained largely unchanged until the mid-20th century, when biochemical and molecular-based test methods were first developed. The reasons for these developments were many: a desire for more rapid results, greater specificity, greater objectivity and ease of use by non-skilled workers, and the continual search for the "holy grail" – sensors embedded on-line in the water distribution system that can detect and characterize bacteriological targets in real time, all the time. By the early 21st century, science had entered a golden age of microbiological test method development, with a multitude of technology options for achieving one or more of the above-mentioned goals.

However, scientists have still not found the "perfect" sensor technology, and it is not obvious that a single technology is emerging as the leading contender for widespread deployment in public water supply networks. Furthermore, there is a wide gap between the available technology, end-user capability to deploy and manage sensor networks, and the nature of the regulatory compliance environment, such that even if the perfect microbiological sensor existed, it would be a struggle to deploy it rapidly across the industry. This chapter provides both a historical review of microbiological detection technologies and an overview of selected biosensor technologies for water quality monitoring, with a focus on the challenges that must be overcome to ensure the successful deployment of advanced microbiological sensor technologies in water supply networks.

2.2 ANCIENT AND MEDIEVAL TIMES – EARLY MICROBIOLOGICAL SENSING

Without water, life cannot exist. It is therefore no surprise that human civilization developed close to sources of fresh water. Though water quantity was typically the deciding factor in where communities were founded, records indicate that our forerunners realized the benefits of a high-quality water supply thousands of years ago. For example, Sanskrit writings document the use of charcoal (now activated carbon) (Enzler, 2018) and Egyptian hieroglyphs mention the use of alum (USEPA, 2000), both of which are still in use today (Figure 2.1). Greek writings and even the Bible mention the use of filters to remove impurities (APEC Water, 2013). In general, these treatments were used to improve the aesthetic quality of water and led to the theory that if water is clean, it is safe to drink. Unfortunately, that theory did not always hold, owing to the hidden microbiological threats that lurked beyond the view of the human eye.

Indeed, other cultures, such as the Chinese, inadvertently discovered how to make water microbiologically safe through other means: tea. By boiling water to make tea, they also disinfected the water. In Medieval Europe, there was at least an anecdotal awareness of microbiological contamination risks in even clean-appearing water, which led many to quench their thirst with wine or beer (Harris & Grigsby, 2009). In some ways, these actions were responses to results gleaned from the first microbiological sensors: gross observation of the impact contaminated water had on other people.

Figure 2.1 Drawings on the walls of the Egyptian rulers Amenophis II and Rameses II (APEC Water, 2013).

Gross observation also led civilizations to seek sources of water elsewhere when waterborne illness occurred. Similarly, when water quantities became limited and began to restrict the growth of cities, methods were developed to bring in additional sources of safe water. In Rome, a series of aqueducts (Figure 2.2) was developed over a 500-year period, allowing the city to become the largest of its time, far larger than the water resources within its periphery could sustain in terms of both quantity and quality (Enzler, 2018). During the industrial revolution, western civilization grew at an exponential rate, especially in urban centres. This put pressure on these major population centres to supply sufficient quantities of clean water to sustain that population growth. Some cities resorted to building extensive water supply networks to bring water from faraway locations, similar to the Roman aqueducts. Others began experimenting with methods that could convert contaminated water into clean water, such as slow sand filters, which were first deployed in Scotland in the early 1800s (Blake, 1956). Neither of these solutions, however, brought direct public awareness of the risks posed by microorganisms in drinking water supplies or of their linkage to the water cycle. Ironically, it was the birth of the water closet that did exactly that.

Figure 2.2 Roman aqueducts (Cartwright, 2012).

2.3 19TH CENTURY – LINKING THE WATER CYCLE TO HUMAN HEALTH

In 19th century England, the cesspit was a ubiquitous piece of infrastructure designed to capture and store human waste underground (Morris, 2009). The next phase of waste-handling technology, the water closet, invented in the late 18th century (Hardy, 1984), was becoming more popular. To handle the elevated water flows from this new apparatus, city planners and engineers began to install modern sewer systems to allow these cesspits to drain more rapidly and not overflow. The drainage points of these systems were natural bodies of water, such as the Thames river. Starting in the 1830s and continuing through to the 1860s, large swathes of London were overcome by outbreaks of cholera (Morris, 2009). The leading theory of the day was that a cloud of sickness had descended upon the city. This Miasmatic Theory (Halliday, 2001) had been used to explain such epidemics in the past, which seemed to come and go with the changing of the seasons. This time, however, the outbreaks did not stop despite seasonal change. The physician John Snow began to explore the situation with great interest. His research began in 1849 and came to a head with the 1854 Soho epidemic (Morris, 2009). He found that most instances of illness and resulting fatalities occurred near certain cisterns. He further found that fatalities outside that specific area often correlated with families that obtained water from the same cisterns, mainly because they preferred the taste of that water. He thus concluded that this was not a case of miasma outbreaks; rather, it was a waterborne outbreak. After several months of investigation, two sources of contamination were identified. First, the integrity of some cesspits had failed, and their contents had leached into nearby cisterns. Second, the water sources of some cisterns (namely, those fed by draw points on the Thames river used by the Southwark and Vauxhall Waterworks Company) were located downstream of sewer system discharge points. It was thus concluded that the source of these outbreaks was human faecal material. Snow's study (Figure 2.3) was a major turning point in the history of public health and is regarded as the founding event of the science of epidemiology (Morris, 2009).

Figure 2.3 Original map of cholera cases in the Soho epidemic of 1854 (Barton, 2018).

Snow's revolutionary work instigated significant political controversy, and his theories were not widely accepted by the time of his death in 1858. However, further debate and study resulted in full acceptance, even by his most prominent opponents, such as William Farr in 1866. In the following years, work done by other prominent researchers, including Louis Pasteur (France) and Robert Koch (Germany), led to the birth of today's most common microbiological sensors: microscopes and growth-based culture tests. In the 1880s, the Germ Theory of disease overtook Miasmatic Theory as the leading explanation for human infection (USEPA, 2000). A "golden era" of bacteriology ensued, in which the theory quickly led to the identification of the actual microorganisms that cause many diseases, waterborne or otherwise. On the back of these discoveries, several advances in water and wastewater treatment gained increasing adoption. Filtration was identified as a means to improve water quality prior to human consumption. Slow sand filtration was developed in the United Kingdom in the early 1800s and improved upon in