
Nonpoint Source Water Pollution

Outline of Quality Assurance Project Plan Requirements

The outline below lists the criteria IDEM staff use to evaluate the Quality Assurance Project Plans (QAPPs) submitted by Section 319 and 205(j) grantees.

1.0: Title and Approval Sheet

  1. Title, 319 Grant, and EDS #.
  2. Organization’s name.
  3. Name, dated signature of project manager/coordinator/quality assurance officer.
  4. Name, dated signature of IDEM Quality Assurance Manager.
  5. Name, dated signature of IDEM NPS/TMDL Section Chief.
  6. Name, dated signature of IDEM Watershed Assessment and Planning Branch Chief.

2.0: Table of Contents

3.0: Appendices and Tables

  1. Appendices.
  2. Tables.
  3. Distribution List.

4.0: Project/Task Organization

  1. Key Personnel - Identifies key individuals and their responsibilities in each participating organization. Provide a mailing address, email address, and telephone number for each.
  2. Project Organizational Chart - Provides an organizational chart showing the direct and ‘dotted’ lines of authority among and within the organizations.

5.0: Special Training Needs/Certification & Qualifications

  • Identifies special needs, and how they will be provided, documented, and assured.

6.0: Problem Definition/Background

  1. Problem Statement
    1. Clearly states problem to be resolved or decision to be made.
    2. Indicates intended use of the data, the decisions to be made with the information.
    3. Describes action levels or standards to be used to make decisions.
    4. Identifies expected data users.
  2. Historical and Background Information
    1. Provides historical and background information, including a list of known pollutants and critical areas.
    2. Provides 303(d) listing information, including date listed, waterbody, segments, and pollutants.
    3. Indicates the need for this work.
    4. Describes previous work or data collected as they relate to this project, including data locations and prior 319 projects.
    5. Describes the approach, connecting what is needed with how it will be obtained.
    6. Lists goals and objectives for collecting environmental samples.
    7. Designs the monitoring project to collect enough data to identify relationships between sources and causes.

7.0: Process Design

  • Links the sampling design to the goals and objectives.
  • Clearly describes whether samples will be analyzed by volunteers or by an analytical laboratory.
  1. Study site description
    1. Includes maps with site numbers in the QAPP, with watershed boundaries for evaluation and source control, and delineates drainage areas at the desired scale.
    2. Identifies data that will be obtained from other sources (secondary/indirect data).
    3. Includes appropriate technical, regulatory, and program-specific quality standards.

8.0: Quality Objectives and Criteria for Measurement Data

  1. Goal Statements & Objective Statements
    1. Describes how the quality objectives for the project were determined, e.g., through a systematic planning process.
    2. If the Data Quality Objectives (DQO) process was followed, describes it and its results, and attaches or references documentation.
  2. Study Site - Briefly describes the site, including road information.
  3. Sampling Design
  4. Study Timetable

9.0: Data Quality Indicators

  1. Precision - Describes how each measurement will be determined and the acceptance criterion for each.
  2. Accuracy and/or Bias - Describes how each measurement will be determined and the acceptance criterion for each. A blank sample can be used to assess bias/accuracy in the field.
  3. Completeness - Gives the level of completeness required for the study and whether sufficient resources will be allocated to ensure project completion.
  4. Representativeness - Describes how the collected data will accurately represent the population or parameter being measured, tying each to the monitoring design in Section 8.1.
  5. Comparability - Clearly states what standards and/or data sets the new data will be compared with, and states goals for achieving data comparability.
  6. Sensitivity - States sensitivity (detection limit) goals for each parameter (relate these to comparability) and discusses their appropriateness for the project. A table of detection limits can be used.

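As an illustration of the indicators above, precision for field duplicates and completeness are commonly computed with simple formulas; the sketch below assumes the standard relative-percent-difference and percent-completeness definitions, and the example values and limits are hypothetical, not IDEM requirements.

```python
from statistics import mean

def relative_percent_difference(primary: float, duplicate: float) -> float:
    """Precision for a field duplicate pair: RPD = |x1 - x2| / mean(x1, x2) * 100."""
    return abs(primary - duplicate) / mean([primary, duplicate]) * 100

def percent_completeness(valid_results: int, planned_samples: int) -> float:
    """Completeness: percentage of planned measurements that yielded valid data."""
    return valid_results / planned_samples * 100

# Example: duplicate nitrate results of 2.0 and 2.2 mg/L (hypothetical data)
rpd = relative_percent_difference(2.0, 2.2)   # ~9.5%
done = percent_completeness(95, 100)          # 95.0%
print(round(rpd, 1), done)
```

A QAPP would then compare each computed value against the acceptance criterion stated for that parameter.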
10.0: Non-direct Measurements (Secondary Data)

  1. Identifies the types of existing data needed and their expected sources (computer databases, literature files, other project reports).
  2. Specifies acceptance criteria for the use of the data.
  3. Discusses limitations of such data and how they will be handled.
  4. Documents the rationale for the original data collection and its relevance to this project.

11.0: Monitoring Requirements

  1. Monitoring Process Design
    1. Describes how and why the monitoring design will accomplish the goals (justifying the design rationale), connecting with the problem definition (Section 6.0).
    2. Describes how spatial and temporal variability will be accounted for.
    3. Identifies the monitoring points with a description of the location of each point. Includes latitude/longitude in decimal degrees.
    4. Identifies the monitoring frequency and schedule, with a brief description of each site and why it was selected. Tables and maps may help and may be referenced elsewhere in the QAPP.
    5. Prepares an Excel spreadsheet documenting the following information for each sample: sample number, site number, date, time, parameters, method number, detection limit, latitude, longitude and location method, result, unit, holding time, validation flags, and DQAL.
    6. Includes stream flow (discharge), not just stream velocity.
    7. Discusses how location information will be obtained (e.g., GPS).
  2. Monitoring Methods
    1. Fully describes all monitoring methods, including field measurements, continuous monitoring, and remote sensing, referencing or attaching SOPs.
    2. Lists all needed monitoring equipment and supplies.
    3. Identifies what to do when problems arise.
    4. States whether samples are to be composited, homogenized, split, etc.
    5. For continuous monitoring, states the averaging time, averaging method, and data logging, downloading, storing, and reporting (telemetering) procedures.
    6. Describes all data acquisition and handling equipment and software, and how they will be tested and verified.
    7. For remote sensing, indicates the area to be imaged, spatial resolution, and degree of overpass.
    8. Describes cleaning and decontamination of field equipment, and how it will be verified.
  3. Site Description - Include a site map of each sampling location with other geographic features relevant to the study.
  4. Field QC Activities - Identifies field QC activities (replicates, field or trip blanks, and splits), their purpose, frequency, acceptance criteria, and corrective actions.
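The per-sample fields called for in Section 11.1.5 can be organized as a flat table with one row per result. The sketch below is one possible layout using Python's standard csv module; the column names are illustrative assumptions, not an IDEM-mandated schema, and the example row is hypothetical data.

```python
import csv
import io

# Column names follow the fields listed in Section 11.1.5; "dqal" is the
# data quality assessment level. Names are illustrative, not mandated.
FIELDS = [
    "sample_number", "site_number", "date", "time", "parameter",
    "method_number", "detection_limit", "latitude", "longitude",
    "location_method", "result", "unit", "holding_time",
    "validation_flags", "dqal",
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=FIELDS)
writer.writeheader()
# Hypothetical example row for one E. coli result
writer.writerow({
    "sample_number": "S-001", "site_number": "3", "date": "2024-05-01",
    "time": "09:15", "parameter": "E. coli", "method_number": "SM 9223B",
    "detection_limit": "1", "latitude": "39.7684", "longitude": "-86.1581",
    "location_method": "GPS", "result": "235", "unit": "MPN/100mL",
    "holding_time": "6 h", "validation_flags": "", "dqal": "2",
})
print(buffer.getvalue())
```

A flat layout like this exports cleanly to the Excel spreadsheet the outline asks for, with one row per sample result.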

12.0: Analytical Requirements

  1. Analytical Methods - Identifies the analytical methods to be followed and provides validation information for non-standard methods.
  2. Analytical QC Activities - Identifies all required laboratory QC checks, their purpose, frequency, acceptance criteria, and corrective actions if acceptance criteria are exceeded.

13.0: Sample Handling and Custody Requirements

  1. Describes the logistics of sample handling from collection through disposal.
  2. For each sample matrix and parameter, specifies the number of samples, volumes, containers, preservation, and allowable holding times. A reference table in another section may be used.
  3. For in situ, continuous, and remote monitoring, includes handling of measurement records.
  4. States requirements for sample archiving and disposal.
  5. Describes sample identification and chain-of-custody procedures, including sample labels, forms, etc.
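Checking allowable holding times, as required above, amounts to comparing the collection-to-analysis interval against the method's limit. A minimal sketch, assuming the holding-time limits shown are hypothetical placeholders (the actual limits come from the analytical methods cited in the QAPP):

```python
from datetime import datetime, timedelta

# Hypothetical holding-time limits in hours; real values must be taken
# from the analytical method for each parameter.
HOLDING_TIME_HOURS = {"E. coli": 6, "nitrate": 48, "total phosphorus": 28 * 24}

def holding_time_exceeded(parameter: str,
                          collected: datetime,
                          analyzed: datetime) -> bool:
    """Flag a sample whose collection-to-analysis interval exceeds the limit."""
    limit = timedelta(hours=HOLDING_TIME_HOURS[parameter])
    return analyzed - collected > limit

collected = datetime(2024, 5, 1, 9, 15)
print(holding_time_exceeded("E. coli", collected, datetime(2024, 5, 1, 14, 0)))  # 4 h 45 min -> False
print(holding_time_exceeded("E. coli", collected, datetime(2024, 5, 1, 16, 0)))  # 6 h 45 min -> True
```

Samples flagged this way would receive the validation qualifiers described in Section 18.0.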

14.0: Testing, Inspection, Maintenance, and Calibration

  1. Instrument/Equipment/Supplies Testing and Maintenance Requirements.
  2. Describes the need for and frequency of equipment calibration and maintenance.
  3. Identifies inspection and acceptance criteria for field, lab, and data management equipment and supplies.

15.0: Assessments/Oversight/Data Quality Assessment & Decision Rules

  1. Data Quality Indicators
    1. Lists the required number, frequency, and type of assessments, such as data quality indicators (precision, accuracy/bias, and completeness); describes the procedure for assessing each indicator; and lists acceptable limits and the associated decision statements.
    2. Identifies individuals responsible for performing such assessments, whether they will be independent, and how the information will be reported.
  2. Corrective Action - Identifies the individuals responsible for corrective actions, and how the actions will be tracked.

16.0: Performance & System Audits

17.0: Preventative Maintenance

18.0: Data Review, Verification, Validation, and Reconciliation with DQIs

  1. Data Review and Verification - States criteria for accepting, rejecting, or qualifying data.
  2. Validation & Qualifiers - Describes the processes for data validation and verification, and for qualifying data.
  3. Reconciliation with User Requirements
    1. Describes how results (validated data) will be reconciled with the requirements defined by users; states who is responsible and what statistical procedures, if any, will be used.
    2. Includes both field and lab issues. States how any limitations on the use of the data will be reported.
  4. Modeling or Statistical Methods Used

19.0: Reports to Management, Documentation, Records

  • Describes process for managing project documents and records.
  • Identifies frequency and distribution of reports, along with names of originators.
  • Itemizes what information and records must be included in final report and all intermediate reports, such as: project status, results of assessments, and significant QA problems.
  • Identifies where raw data, logs and final report will be located and in what form.
  • Identifies how data can be retrieved at a later date and length of time they must be retained.
  1. Data Reporting
  2. Data Management
    1. Describes data management throughout the project, including: record keeping, transformation, reduction, storage, retrieval, and security.
    2. Describes the data handling equipment and procedures used to process, compile, error-check, and analyze data.
    3. States agreement to fill in the Excel data sheet supplied by the NPS group.
  3. Data Quality Assessment Levels
  4. D9 Table from Assessment Branch
  5. Quality Assurance Reports

20.0: References

21.0: Appendices