Enhanced biological phosphorus removal (EBPR) has been used for decades to remove phosphorus from municipal wastewater because it allows facilities to meet water quality goals while minimizing chemical consumption and sludge production. However, there is still substantial variability in both the practices used to achieve EBPR and the level of soluble phosphorus removal achieved. The objective of this research project was to develop information that municipal wastewater treatment plants can use to remove phosphorus more efficiently and cost-effectively through EBPR processes. The project included detailed analysis of routine water quality and operating data, field testing observations, and special studies conducted over the course of the project to evaluate the variability of EBPR, the factors influencing EBPR performance, and the relationship between EBPR and the presence of glycogen accumulating organisms (GAOs).

The need to control and remove phosphorus (P) in discharges from wastewater treatment facilities (WWTFs) to prevent eutrophication of receiving waters is well known. Regulatory initiatives are adding increasingly stringent P limits to discharge permits. As a result, many agencies face effluent limits of 0.1 mg P/L total phosphorus (TP) or lower. Since the reliable performance limit of enhanced biological phosphorus removal (EBPR) is commonly accepted to be about 0.1 mg P/L in the dissolved form, these facilities will need tertiary chemical phosphorus removal (CPR) to reliably achieve limits below 0.1 mg P/L (ultra-low limits) (Pagilla and Urgun-Demirtas, 2007).


The mechanistic basis for P removal by chemical precipitant addition is generally considered to be more than simple precipitation. Adsorption and/or surface complexation of reactive or unreactive phosphorus onto already-formed chemical precipitates or complexes has been reported. Omoike and VanLoon (1999) showed that the most likely mechanism involves sorption of P species onto Al(OH)3(s) formed when Al salts are added to water, followed by further precipitation/complexation of P onto the aluminum hydroxide precipitate for additional P removal. In tertiary P removal, a non-stoichiometric dose, often several times the stoichiometric requirement, is usually needed. The precipitates formed at such high precipitant doses likely serve as nuclei for further aggregation of newly formed precipitates, enhancing overall tertiary P removal. The excess precipitant added to the secondary effluent produces hydroxide precipitates that can serve as adsorbents for additional P removal. Alum sludge from water treatment plants is well documented to have a high capacity for removing P from wastewater effluent, depending on the structure and/or age of the sludge solids. Mixing, pH, contact time, precipitated-solids concentration in the mixing tank, and feed secondary effluent characteristics could all influence this tertiary P removal process.
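As a rough illustration of the dosing arithmetic described above, the sketch below compares an idealized 1:1 Al:P stoichiometric alum dose with a multiplied tertiary dose. The molecular weights are standard values; the 4x multiplier is an assumed illustrative figure, not a design recommendation, and the function names are hypothetical.

```python
# Illustrative sketch only: stoichiometric vs. applied alum dose for
# tertiary P removal, assuming an idealized 1:1 Al:P molar ratio.
# Real tertiary doses are typically several times stoichiometric.

P_MW = 30.97     # g/mol, phosphorus
ALUM_MW = 594.4  # g/mol, Al2(SO4)3.14H2O (two Al per formula unit)

def stoichiometric_alum_dose(p_mg_per_l: float) -> float:
    """Alum dose (mg/L) for 1:1 Al:P removal of p_mg_per_l of P."""
    p_mmol = p_mg_per_l / P_MW   # mmol/L of P to precipitate
    alum_mmol = p_mmol / 2       # two Al atoms per alum formula unit
    return alum_mmol * ALUM_MW   # mg/L of alum

def applied_dose(p_mg_per_l: float, multiplier: float = 4.0) -> float:
    """Assumed illustrative multiplier on the stoichiometric dose."""
    return multiplier * stoichiometric_alum_dose(p_mg_per_l)

# Removing 1 mg/L of P at 1:1 Al:P requires roughly 9.6 mg/L of alum:
print(round(stoichiometric_alum_dose(1.0), 1))  # -> 9.6
```

A dose calculation of this kind gives only a lower bound; as the text notes, the excess beyond stoichiometry is what forms the hydroxide solids that drive the adsorption pathway.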



The investigation of these variables and the process, combined with careful analytical measurements, would explain the basis for precipitation/solids-contact P removal from wastewater effluents. Such information could be used to enhance the process and optimize the operating conditions of tertiary P removal by chemical addition.



This paper presents full-scale and lab-scale investigations of some of the parameters listed above for tertiary P removal, with specific focus on the role of precipitated solids in enhancing P removal and/or reducing the precipitant dose required to achieve the desired P removal.



The study was conducted at the Iowa Hill Water Reclamation Facility (WRF) and Farmers Korner WWTF in Breckenridge, CO, and at the Illinois Institute of Technology.

Available as eBook only.


This project addresses the nitrogen and phosphorus removal technologies being successfully implemented at existing wastewater treatment plants (WWTPs), key challenges and knowledge gaps in implementing those technologies, and research needs for improving existing methods and technologies to achieve low total nitrogen (TN) and total phosphorus (TP) effluents (TN of 5 mg/L and TP of 0.5 mg/L). Technology and cost assessment of successful technologies for sustainable TN and TP removal was accomplished using a "threshold limit" approach to categorize technologies for different effluent TN and TP limits based on desired criteria. Membrane-based processes were selected from among the advanced treatment processes for N and P removal from wastewater for pilot- and lab-scale technology demonstration studies.


The investigations included membrane bioreactor applications for N and P removal from wastewater, for centrate treatment, and for achieving simultaneous nitrification and denitrification. The results indicated that membrane-based applications are attractive methods for achieving low TN and TP effluents, but supplementary chemical addition is needed depending on the wastewater characteristics. Full-scale implementation of step-feed BNR with chemical P removal was demonstrated as a successful technology transfer application at the John Egan Wastewater Reclamation Plant of the Metropolitan Water Reclamation District of Greater Chicago (MWRDGC).





This study focuses on sustainability impacts as wastewater treatment plants implement treatment technologies to meet increasingly stringent nutrient limits. The objective is to determine if a point of "diminishing returns" is reached where the sustainability impacts of increased levels of nutrient removal outweigh the benefits of better water quality.


Five different hypothetical treatment trains at a nominal 10 mgd flow were developed to meet treatment targets ranging from cBOD removal (Level 1) to four different nutrient removal targets. The nutrient removal targets ranged from 8 mg N/L and 1 mg P/L (Level 2) to the most stringent at 2 mg N/L and 0.02 mg P/L (Level 5). Because sustainability is a broad term, the industry-accepted three pillars of sustainability were evaluated and discussed, with particular emphasis placed on the environmental and economic pillars. The following variables received the most attention: greenhouse gas (GHG) emissions, a water quality surrogate that reflects potential algal growth, capital and operational costs, energy demand, and consumables such as chemicals, gas, and diesel. The results from the GHG emissions metric are shown below. Note that biogas cogeneration is represented by negative values because biogas production can be used to offset energy demands. The nitrous oxide (N2O) emissions values are based on the average of the biological nutrient removal (BNR) and non-BNR plants evaluated in the United States national survey by Ahn et al. (2010b). The error bars represent the data range of the national survey.



The GHG emissions results suggest that a point of diminishing returns is reached at Level 4 (3 mg N/L; 0.1 mg P/L). GHG emissions increase steadily from Levels 1 to 4, followed by a 65% to 70% increase when moving from Level 4 to 5. Despite this increase in GHGs, the discharged nutrient load decreases by only 1% in going from Level 4 to 5. The primary contributors to GHG emissions are energy-related (aeration, pumping, and mixing). The GHG emissions associated with chemical use increase for the more stringent nutrient targets that require chemical treatment in addition to biological nutrient removal. In terms of cost, the total project capital cost increases by approximately one-third, from $9.3 million to $12.7 million, in moving from Level 1 to 2, and more than doubles in moving from Level 1 to 5. Total project capital costs in this report are for a greenfield plant. The operational cost increase between levels is more pronounced than the total project capital cost increase, rising more than fivefold from Level 1 to 5 ($250/MG treated to $1,370/MG treated, respectively).
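The percentage comparisons quoted above can be reproduced with simple arithmetic. The sketch below uses only the cost figures stated in the text ($9.3M and $12.7M capital; $250/MG and $1,370/MG operational); the dictionary and function names are illustrative.

```python
# Sketch: the percent-increase arithmetic behind the cost comparisons
# in the text. Values are the figures quoted in the abstract.

capital = {"Level 1": 9.3e6, "Level 2": 12.7e6}   # total project capital, $
opex_per_mg = {"Level 1": 250.0, "Level 5": 1370.0}  # $/MG treated

def pct_increase(old: float, new: float) -> float:
    """Percent increase of new over old."""
    return 100.0 * (new - old) / old

cap_rise = pct_increase(capital["Level 1"], capital["Level 2"])
opex_ratio = opex_per_mg["Level 5"] / opex_per_mg["Level 1"]

print(f"Capital, Level 1 -> 2: +{cap_rise:.0f}%")  # ~+37%, i.e. about one-third
print(f"O&M, Level 1 -> 5: {opex_ratio:.1f}x")     # ~5.5x, i.e. more than fivefold
```

Running the numbers confirms the abstract's characterizations: a roughly one-third capital increase from Level 1 to 2 and a more-than-fivefold operational cost increase from Level 1 to 5.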



This report focused on in-plant (point source) options for nutrient removal and the implications for cost and sustainability. Other approaches, such as addressing non-point sources, could be added to the assessment. Rather than focusing strictly on point source dischargers and requiring Level 4 or 5 treatment, Level 3 or 4 treatment complemented with best management practices for non-point sources might be a more sustainable approach to achieving comparable water quality.

The development of this WERF anaerobic digester (AD) foaming guidance document responds to the need for a specific, detailed methodology that water reclamation and recovery facility (WRRF) personnel can follow and implement to manage AD foaming incidents. This guidance manual is the final product of a comprehensive research effort on AD foaming in full-scale WRRFs. It is the outcome of lessons learned from an extensive literature review, WRRF surveys, full-scale data studies, and experimental analyses. The primary goals of this document are to: 1) serve as a practical and usable tool for WRRFs; 2) present feasible, cost-effective alternatives; 3) present large amounts of practical, focused information in a concise manner; and 4) present information with a sound technical and scientific basis that can become a standard or foundation for foam management in WRRFs.

This guidance document incorporates a systematic approach for integrated capacity assessment of a wastewater treatment plant and for identifying performance-limiting factors. Based on the project team's evaluation, this document presents a "best-practices" approach that defines a systematic process for applying analytical methods and testing tools to optimize and upgrade wastewater treatment plants.


This guidance document suggests a systematic approach to capacity evaluation with three generalized stages: a desktop evaluation as an initial plant assessment; a detailed evaluation for individual unit process analysis and testing, along with an integrated evaluation of the whole plant; and field-scale testing as confirmatory testing for implementation of corrective actions. The document covers analysis, testing methods, and protocols for plant monitoring. It describes the integrated hydraulics and process modeling approach for determining plant capacity. The document addresses currently practiced treatment processes, evaluation tools, and equipment in the industry. Individual unit processes in liquid treatment and solids handling are illustrated with detailed process analysis and testing, performance-limiting factors, approaches for overcoming potential constraints, capacity assessments, and case studies. The document also provides recommendations and sources of detailed information for testing procedures.

Available as eBook only.


Phosphorus measurements at very low concentrations have proven to be unreliable. The establishment of stricter phosphorus discharge requirements has challenged wastewater facilities to measure low phosphorus concentrations (20 µg/L) in the effluent. The major challenges associated with low-level phosphorus measurements appear to be related to the sample matrix and the digestion methodologies. Total phosphorus measurements in wastewater effluent and high-quality deionized water samples, as well as orthophosphate measurements in wastewater effluent samples, show broad variability. In contrast, orthophosphate measurements in a deionized water matrix spiked to 3 µg/L and 6 µg/L show insignificant variability.


The Ascorbic Acid Method appears to be a reliable technique for measuring orthophosphate at low levels. The findings of the current study demonstrate that as the phosphorus concentration increases, the variability decreases. A reliable analytical process is needed to provide information both to regulators setting sound permit limits and to utilities required to meet those limits. Current methods show significant variability when measuring total phosphorus and orthophosphate at very low (20 µg/L) concentrations. This study provides important information regarding the capability of wastewater and commercial laboratories to accurately determine low phosphorus concentrations (0-20 µg/L). The findings raise important questions regarding the establishment of permit limits and the ability of utilities to comply with them. Measurements made to comply with very low limits (20 µg/L TP) will inherently vary, making it impossible to determine both the environmental impact of the discharge stream and the performance of the utility.
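One common way to quantify the variability discussed above is the percent relative standard deviation (RSD) of replicate measurements. The sketch below uses hypothetical replicate values (not data from the study) to illustrate how the same absolute scatter produces a much larger relative variability near 3 µg/L than near 20 µg/L, consistent with the finding that variability decreases as concentration increases.

```python
# Sketch: percent RSD as a variability metric for replicate low-level
# phosphorus measurements. Replicate values are hypothetical.
import statistics

def percent_rsd(replicates):
    """100 * sample standard deviation / mean of the replicates."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

low = [3.1, 2.4, 3.8, 2.7, 3.5]        # ug/L, near a 3 ug/L spike
high = [20.1, 19.4, 20.8, 19.7, 20.5]  # ug/L, same absolute scatter, near 20 ug/L

print(round(percent_rsd(low), 1))   # -> 18.4 (large relative variability)
print(round(percent_rsd(high), 1))  # -> 2.8 (small relative variability)
```

Both replicate sets have the same standard deviation; only the mean differs, which is why the relative metric diverges so sharply at the low end of the measurement range.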

The overall aim of this research was to evaluate membrane bioreactor (MBR) process designs for meeting low effluent nitrogen and/or phosphorus concentrations at municipal wastewater treatment plants (WWTPs), and to identify design and operating issues unique to the application of MBR technology for achieving a high level of nutrient removal, such as effluent total nitrogen (TN) below 3 to 6 mg/L and total phosphorus (TP) below 0.1 to 1.0 mg/L.


The application of membrane bioreactors has increased rapidly over the past two decades, expanding knowledge of and experience with the technology. Textbooks and other publications are available that provide fundamental information on membrane separation and terminology specific to MBRs, as well as information on the process and detailed design of MBR systems. The fundamentals and design considerations of biological nutrient removal systems have also been documented. This document does not repeat those details on MBR technology or nutrient removal, and the reader is referred to the texts below for such information. However, key process terminology is defined where it is important to provide clarity for the reader.