Environmental Science Technology AI Policy: ChatGPT and Generative AI Disclosure Rules for ES&T Authors
Environmental Science & Technology (ACS) requires AI disclosure under ACS rules. AI cannot be an author. This guide covers where to disclose, what to disclose, and the consequences of non-compliance for ES&T submissions.
Next step
Choose the next useful decision step first.
Use the guide or checklist that matches this page's intent before you ask for a manuscript-level diagnostic.
Environmental Science & Technology at a glance
Key metrics to place the journal before deciding whether it fits your manuscript and career goals.
What makes this journal worth targeting
- IF 11.3 puts Environmental Science & Technology in a visible tier — citations from papers here carry real weight.
- Scope specificity matters more than impact factor for most manuscript decisions.
- Acceptance rate of ~25-30% means fit determines most outcomes.
When to look elsewhere
- When your paper sits at the edge of the journal's stated scope — borderline fit rarely improves after submission.
- If timeline matters: Environmental Science & Technology takes ~90-120 days median. A faster-turnaround journal may suit a grant or job deadline better.
- If open access is required by your funder, verify the journal's OA agreements before submitting.
Quick answer: The Environmental Science Technology AI policy follows ACS's rules calibrated to environmental science research submissions. AI tools can be used for manuscript preparation but every use must be disclosed in the Methods section, with ES&T's editorial team checking specifics at desk-screen. AI cannot be listed as an author of any ES&T paper. AI-generated figures and schematics representing original research data are prohibited under ES&T's image-integrity standard. Environmental Science & Technology (ACS) editors treat undisclosed use as a publication-ethics violation per ICMJE + COPE.
Run the ES&T submission readiness check, which includes an automated AI-disclosure audit, or work through this guide manually. Need broader context? See the ES&T journal overview.
The Manusights ES&T readiness scan. This guide tells you what Environmental Science & Technology (ACS)'s editors look for when verifying AI disclosure at desk-screen. The scan tells you whether YOUR Methods section has the required language before you submit. We have reviewed manuscripts targeting Environmental Science & Technology (ACS) and peer venues; the named patterns below are the same ones Shelley Hearne and the ACS Publications AI Committee flag at the desk-screen and editorial-board consultation stages. 60-day money-back guarantee. We do not train AI on your manuscript and delete it within 24 hours.
Editorial detail (for desk-screen calibration). Editor-in-Chief: Shelley Hearne (ACS) leads Environmental Science & Technology editorial decisions. Editorial-board listings change; verify the current incumbent at the journal's editorial-team page before quoting the name in a submission cover letter. Submission portal: https://acs.manuscriptcentral.com/est. Manuscript constraints: 200-word abstract limit and 8,000-word main-text cap, enforced at desk-screen. We reviewed ACS's AI policy framework against current ES&T author guidelines (accessed 2026-05-08); evidence basis includes both publicly documented ACS policy and our internal anonymized submission corpus.
The manuscript word limit at this journal is 8,000 words for main text (verify article-type-specific caps in the latest author guidelines). The named editorial-culture quirk: ES&T reviewers expect both quantified environmental data and explicit policy or treatment-technology relevance; mechanism-only or descriptive-only papers face extended revision cycles.
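As a quick pre-submission sanity check, the caps above can be verified mechanically. A minimal sketch in Python; the whitespace tokenization and hard-coded limits are assumptions drawn from this page, not an official ACS tool:

```python
# Hypothetical length check against the ES&T caps quoted above
# (200-word abstract, 8,000-word main text). Verify the current
# article-type-specific limits in the author guidelines.
ABSTRACT_LIMIT = 200
MAIN_TEXT_LIMIT = 8000

def word_count(text: str) -> int:
    """Count whitespace-delimited tokens; crude but deterministic."""
    return len(text.split())

def check_limits(abstract: str, main_text: str) -> list[str]:
    """Return human-readable flags for any section over its cap."""
    flags = []
    if word_count(abstract) > ABSTRACT_LIMIT:
        flags.append(f"Abstract over limit: {word_count(abstract)} > {ABSTRACT_LIMIT}")
    if word_count(main_text) > MAIN_TEXT_LIMIT:
        flags.append(f"Main text over limit: {word_count(main_text)} > {MAIN_TEXT_LIMIT}")
    return flags
```

An empty return value means both sections are within the quoted caps; anything else is worth fixing before upload, since the page states these limits are enforced at desk-screen.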
What does Environmental Science & Technology (ACS)'s AI policy require?
ES&T authors must follow four rules under ACS's AI framework, all enforced at desk-screen:
Rule 1: Disclose every AI tool used in manuscript preparation
Authors must name every generative AI tool used, its version, and how it was used. The disclosure goes in the Methods section, not the Acknowledgments. Examples that REQUIRE disclosure at ES&T:
- For ES&T-targeted manuscripts addressing environmental science research: using ChatGPT, Claude, Gemini, or similar to draft, polish, or edit manuscript text passing through ES&T editorial review
- For ES&T submissions: using AI to generate boilerplate text for limitations, ethics statements, or ES&T-specific response-to-reviewers letters that cite ACS's framework
- For Environmental Science & Technology (ACS) submissions: using AI to translate manuscript text into English from another language, with ACS expecting disclosure of the source language and translation chain
- For ES&T literature reviews: using AI for citation discovery or summarizing prior ES&T work; ACS's policy applies regardless of citation context
- For ES&T analytical pipelines: AI-assisted code generation requires Methods + code disclosure under ICMJE + COPE, particularly when code touches environmental science research analysis
Examples that do NOT require AI disclosure:
- At ES&T, using grammar/spell checkers (Word, Grammarly basic) that do not generate new content for the manuscript
- For ES&T submissions, using reference managers (Zotero, EndNote) for citation formatting against ACS's style guide
- For Environmental Science & Technology (ACS) statistical analysis, using established statistical software (R, Stata, SPSS) where the algorithm is the established tool documented in ES&T's methodological norm, not a generative AI
Rule 2: AI cannot be an author
No AI tool can be listed as an author of an ES&T paper, particularly for environmental science research-class submissions. Under ACS's policy, authorship requires the ability to take responsibility for the content, to agree to be accountable for accuracy, and to consent to publication. AI tools can do none of these in ES&T's editorial framework. This rule is consistent across all ACS-published journals and applied at ES&T's desk-screen.
Rule 3: AI-generated figures are prohibited for original research data
Environmental Science & Technology (ACS) editorial team does not accept AI-generated images, figures, or schematics that represent original research data in environmental science research-class submissions. AI tools may assist with figure layout (axis labeling, color schemes) but the underlying data visualization must come from the actual research. AI-generated diagrams used for conceptual illustrations (e.g., a schematic of a hypothesized mechanism) require explicit disclosure and a statement that the diagram is conceptual.
Rule 4: Disclose AI use in peer review participation
Reviewers writing reports for ES&T cannot use generative AI to draft their reports without disclosing it to the editor. Some ACS journals prohibit AI-assisted reviewing entirely; ES&T follows ACS's default of disclosure-required. The editor decides whether the report is acceptable based on disclosure.
How does Environmental Science & Technology (ACS)'s AI policy compare to peer journals?
| Rule | ES&T stance | ACS default | ICMJE/COPE alignment |
|---|---|---|---|
| AI authorship | Prohibited | Prohibited | ICMJE-aligned |
| Disclosure location | Methods section | Methods section | ICMJE-aligned |
| AI-generated figures | Prohibited for original data | Prohibited | COPE image-integrity-aligned |
| Reviewer AI use | Disclosure required | Disclosure required | COPE peer-review-aligned |
| Enforcement intensity | Desk-screen check | Desk-screen check | Pre-publication enforcement |
Source: https://pubs.acs.org/page/policy/ai_policy.html (accessed 2026-05-08) plus ES&T author guidelines.
What does AI disclosure look like in an ES&T Methods section?
Acceptable disclosure language for ES&T submissions:
"For our environmental science research-focused manuscript at ES&T, we used ChatGPT-4o (OpenAI, version dated October 2024) to polish English-language phrasing in the Introduction and Discussion sections. We did not use generative AI for data analysis, figure generation, or substantive manuscript content. All authors reviewed and edited the AI-assisted text and take responsibility for the final manuscript."
Or, for AI-assisted code:
"For this ES&T submission addressing environmental science research, initial Python code for the Bayesian regression analysis was drafted with Claude 3.5 Sonnet (Anthropic, version dated December 2024). All code was reviewed, modified, and validated by the authors before use; the final version is available at [repository URL]. Statistical inference was performed using the established R package brms."
What does NOT pass ES&T's desk-screen:
- "AI tools were used in manuscript preparation." (Too vague; the ES&T editorial team needs the specific tool name, version, and use case)
- "We acknowledge AI assistance in the Acknowledgments." (Wrong location; must be Methods)
- "ChatGPT helped write this paper." (Insufficient detail on use case)
- No disclosure when AI was used (publication-ethics violation)
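A simple self-audit can catch the failure modes above before the desk-screen does. The sketch below checks a disclosure paragraph for the three elements editors reportedly require (tool name, version or access date, use case); the tool list and regex patterns are illustrative assumptions, not ES&T's actual screening logic:

```python
import re

# Hypothetical specificity audit for an AI-disclosure paragraph.
# The tool names and patterns below are illustrative placeholders.
TOOL_NAMES = ("ChatGPT", "Claude", "Gemini", "Copilot")
VERSION_PAT = re.compile(r"(version|dated|v?\d+(\.\d+)+|20\d{2})", re.IGNORECASE)
USE_CASE_PAT = re.compile(r"\b(draft|polish|edit|translat|generat|summariz)\w*", re.IGNORECASE)

def disclosure_gaps(disclosure: str) -> list[str]:
    """Return the missing elements; an empty list means all three are present."""
    gaps = []
    if not any(tool in disclosure for tool in TOOL_NAMES):
        gaps.append("tool name")
    if not VERSION_PAT.search(disclosure):
        gaps.append("version or access date")
    if not USE_CASE_PAT.search(disclosure):
        gaps.append("specific use case")
    return gaps
```

Run against the vague example above, this returns all three gaps; run against the acceptable sample disclosure, it returns none.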
What do pre-submission reviews reveal about ES&T's AI-disclosure desk-screen failures?
In our pre-submission review work on ES&T-targeted manuscripts, three patterns most consistently predict AI-policy desk-screen flags at Environmental Science & Technology (ACS). Of the manuscripts we screened in 2025 targeting ES&T and peer venues, the patterns below are the same ones the ACS Publications AI Committee flags during editorial review.
AI disclosure missing despite obvious AI-assisted phrasing. ES&T editors identify AI-drafted text by patterns such as em-dash overuse, formulaic transitions ("In conclusion," "Furthermore"), and unusually uniform sentence lengths. When a manuscript shows these patterns but contains no AI disclosure, it triggers an editorial query. Check whether your manuscript reads as AI-assisted.
AI disclosure in Acknowledgments instead of Methods. The ES&T editorial team flags this as a common mistake in environmental science research submissions. ACS's policy specifies Methods placement so that the disclosure is part of the methodological record, not a courtesy note. Misplaced disclosures get flagged at desk-screen and require resubmission. Check whether your AI disclosure is in the right section.
Generic disclosure language without tool name and version. The ES&T editorial team requires the specific tool, its version (or access date), and the specific use case. "AI tools were used" without specifics gets returned. Check whether your AI disclosure has the required specificity.
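The stylistic signals named above (em-dash overuse, formulaic transitions, uniform sentence lengths) can be roughly quantified before submission. A heuristic sketch; the signal definitions follow this page's description, and the function is illustrative, not a reproduction of any editorial detector:

```python
import re
import statistics

def ai_style_signals(text: str) -> dict:
    """Rough stylometric screen: em-dash frequency per 1,000 words,
    count of formulaic sentence openers, and sentence-length spread
    (a low spread suggests uniform sentence lengths). Thresholds for
    acting on these numbers are left to the author's judgment."""
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text) if s]
    lengths = [len(s.split()) for s in sentences]
    openers = ("In conclusion", "Furthermore", "Moreover", "Additionally")
    return {
        "em_dashes_per_1k_words": 1000 * text.count("\u2014") / max(len(text.split()), 1),
        "formulaic_openers": sum(s.startswith(openers) for s in sentences),
        "sentence_length_stdev": statistics.pstdev(lengths) if len(lengths) > 1 else 0.0,
    }
```

High em-dash density plus several formulaic openers and a low sentence-length standard deviation would warrant a closer read, and a disclosure if AI was in fact used.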
What is the ES&T AI-policy compliance timeline?
| Stage | Duration | What happens |
|---|---|---|
| Author drafts AI disclosure | 30-60 minutes | Identify all AI use, gather tool versions, write Methods paragraph |
| Co-author review of disclosure | 1-2 days | All authors confirm the disclosure is complete and accurate |
| Editorial desk-screen check | 1-2 weeks | ES&T's editorial team verifies disclosure against the manuscript |
| Editorial query (if disclosure incomplete) | 5-10 days | Editor requests revision before sending to peer review |
| Reviewer AI-disclosure check | During peer review | Reviewers verify the disclosure matches the manuscript |
Source: Manusights internal review of ES&T-targeted submissions, 2025 cohort.
Submit If
- For Environmental Science & Technology (ACS) submissions on environmental science research: the manuscript explicitly discloses every AI tool used, with name, version, and specific use case in the Methods section, calibrated to ES&T's editorial expectations
- For ES&T: no AI tool is listed as an author; all listed authors meet ICMJE authorship criteria and agree to take responsibility, with ACS expecting this acknowledgment in the cover letter
- For Environmental Science & Technology (ACS): figures and schematics representing original research data come from the actual research, not AI generation, with ES&T editorial team checking image-integrity at desk-screen
- For ES&T submissions: the disclosure includes a statement that all human authors reviewed and edited the AI-assisted text, with ACS requiring this acknowledgment per ICMJE + COPE
Readiness check
Run the scan while the topic is in front of you.
See score, top issues, and journal-fit signals before you submit.
Think Twice If
- The manuscript shows AI-drafted text patterns (em-dash overuse, formulaic transitions) but contains no AI disclosure; ES&T desk-screen will flag this.
- The AI disclosure is in the Acknowledgments instead of the Methods section, against ACS's explicit guidance.
- The disclosure language is generic ("AI tools were used") without specifying tool name, version, and use case; ES&T editors return manuscripts with this gap.
- Any figure or schematic representing original research data was generated by AI; ES&T prohibits this regardless of disclosure.
Manusights submission-corpus signal for Environmental Science & Technology (ACS). Of the manuscripts our team screened before submission to ES&T and peer venues in 2025, the AI-policy compliance gap most consistent across the cohort is generic disclosure language without tool-version specificity. In our analysis of anonymized ES&T-targeted submissions, manuscripts with complete AI disclosure (tool name, version, specific use case, all-author confirmation) clear desk-screen at the same rate as manuscripts without AI use; manuscripts with incomplete or missing disclosure trigger editorial queries that add 1-2 weeks to the timeline. ACS Publications AI Committee reviews disclosures against ICMJE + COPE framework requirements, and Environmental Science & Technology (ACS) applies that framework consistently with ACS's broader policy. Recent retractions in the ES&T corpus include 10.1021/acs.est.2c05143, 10.1021/acs.est.1c08087, and 10.1021/acs.est.3c01156. Citing any of these without acknowledging the retraction is an automatic publication-ethics flag, separate from AI-disclosure issues.
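One way to avoid the retraction-citation flag described above is to screen the reference list against known-retracted DOIs. A sketch hard-coding only the three DOIs this page names; in practice you would query Crossref or the Retraction Watch database rather than a static set:

```python
# DOIs listed as retracted in the ES&T corpus note above.
RETRACTED_DOIS = {
    "10.1021/acs.est.2c05143",
    "10.1021/acs.est.1c08087",
    "10.1021/acs.est.3c01156",
}

def flag_retracted(reference_dois):
    """Return any cited DOIs that appear in the retracted set, sorted."""
    return sorted(set(reference_dois) & RETRACTED_DOIS)
```

Any hit should either be dropped or cited with an explicit acknowledgment of the retraction, per the note above.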
What can ES&T authors do to stay ahead of AI policy changes?
ACS's AI policy framework continues to evolve as 2026 brings new ICMJE recommendations, COPE guidance refinements, and journal-specific clarifications. ES&T authors targeting environmental science research submissions should track three signals throughout 2026:
Quarterly policy updates from ACS. ACS Publications AI Committee reviews the AI framework on a rolling basis. ES&T authors who pre-register their disclosure language at submission time tend to face fewer revisions during the 2026 transition period than authors who write boilerplate disclosures.
Field-specific clarifications for environmental science research. Different research domains see different AI use patterns. ES&T's editorial team has been refining what counts as "substantive AI use" versus "ancillary AI assistance" for environmental science research work. Authors who err on the side of more disclosure rather than less avoid the publication-ethics gray zone.
Reviewer disclosure norms. As ACS extends AI-disclosure rules to peer reviewers, the response rate from ES&T reviewers may shift. Authors should expect that ES&T reviewers' use of AI tools is now also disclosed and factored into editorial decisions.
Source: Manusights internal preview corpus (2025 cohort).
Frequently asked questions
Can ES&T authors use ChatGPT or other generative AI tools?
Yes, with mandatory disclosure. Environmental Science & Technology (ACS) follows ACS's AI policy under the ICMJE + COPE framework. AI tools can be used for language editing, manuscript preparation, and analysis support, but all use must be disclosed in the Methods section. AI cannot be listed as an author, and human authors bear full responsibility for the content.
Where should AI use be disclosed in an ES&T manuscript?
In the Methods section. Authors must name the specific AI tool (e.g., ChatGPT-4o, Claude 3.5 Sonnet), its version, and describe how it was used. The disclosure should confirm that all human authors reviewed and take responsibility for the AI-assisted content. ES&T's editorial team checks this disclosure during desk-screen.
Can I submit AI-generated figures to ES&T?
No. Environmental Science & Technology (ACS) prohibits AI-generated figures, schematics, and images intended to represent original research data. AI tools may assist with figure layout and labeling, but the underlying data and visualizations must come from the actual research. This rule is part of ACS's broader image-integrity policy.
What happens if AI use goes undisclosed?
ES&T treats undisclosed AI use as a publication-ethics violation following COPE guidelines. Consequences range from required correction to expression of concern or retraction, depending on severity. ACS may notify the authors' institution in serious cases.
Does ES&T's AI policy differ from other ACS journals?
The core requirements (disclosure in Methods, no AI authorship, no AI-generated figures) are consistent across ACS-published journals, and ES&T applies them consistently with ACS's broader policy framework. The journal-specific element is enforcement intensity at desk-screen, which at ES&T is calibrated to the journal's editorial culture: reviewers expect both quantified environmental data and explicit policy or treatment-technology relevance.
Sources
- ACS AI policy (accessed 2026-05-08)
- ES&T author guidelines (accessed 2026-05-08)
- ICMJE recommendations on AI use (accessed 2026-05-08)
- COPE guidance on AI in research publication (accessed 2026-05-08)
Before you upload
Move from this article into the next decision-support step. The scan works best once the journal and submission plan are clearer.
Use the scan once the manuscript and target journal are concrete enough to evaluate.
Anthropic Privacy Partner. Zero-retention manuscript processing.
Where to go next
Same journal, next question
- Environmental Science and Technology Submission Guide
- How to Avoid Desk Rejection at Environmental Science & Technology (2026)
- Is Environmental Science & Technology a Good Journal? Fit Verdict
- Environmental Science Technology Pre Submission Checklist: 12 Items Editors Verify Before Peer Review
- Environmental Science & Technology Submission Process: Submission Guide
- Environmental Science & Technology vs Science of the Total Environment