Physical Review D AI Policy: ChatGPT and Generative AI Disclosure Rules for Physical Review D Authors
Physical Review D (APS) requires AI disclosure under APS rules. AI cannot be an author. This guide covers where to disclose, what to disclose, and the consequences of non-compliance for Physical Review D submissions.
Next step
Choose the next useful decision step first.
Use the guide or checklist that matches this page's intent before you ask for a manuscript-level diagnostic.
Physical Review D at a glance
Key metrics to place the journal before deciding whether it fits your manuscript and career goals.
What makes this journal worth targeting
- IF 5.3 puts Physical Review D in a visible tier — citations from papers here carry real weight.
- Scope specificity matters more than impact factor for most manuscript decisions.
- Acceptance rate of ~~50-60% means fit determines most outcomes.
When to look elsewhere
- When your paper sits at the edge of the journal's stated scope — borderline fit rarely improves after submission.
- If timeline matters: Physical Review D takes ~~60-90 days median. A faster-turnaround journal may suit a grant or job deadline better.
- If open access is required by your funder, verify the journal's OA agreements before submitting.
Quick answer: The Physical Review D AI policy follows APS's rules calibrated to particle physics, cosmology, or gravitation advance with full theoretical or observational characterization submissions. AI tools can be used for manuscript preparation but every use must be disclosed in the Methods section, with Physical Review D's editorial team checking specifics at desk-screen. AI cannot be listed as an author of any Physical Review D paper. AI-generated figures and schematics representing original research data are prohibited under Physical Review D's image-integrity standard. Physical Review D (APS) editors treat undisclosed use as a publication-ethics violation per ICMJE + COPE.
Run the Physical Review D submission readiness check which includes an automated AI-disclosure audit, or work through this guide manually. Need broader context? See the Physical Review D journal overview.
The Manusights Physical Review D readiness scan. This guide tells you what Physical Review D (APS)'s editors look for when verifying AI disclosure at desk-screen. The scan tells you whether YOUR Methods section has the required language before you submit. We have reviewed manuscripts targeting Physical Review D (APS) and peer venues; the named patterns below are the same ones Hugues Chate and APS editorial AI working group flag at the desk-screen and editorial-board consultation stages. 60-day money-back guarantee. We do not train AI on your manuscript and delete it within 24 hours.
Editorial detail (for desk-screen calibration). Editor-in-Chief: Hugues Chate (APS) leads Physical Review editorial decisions. Editorial-board listings change; verify the current incumbent at the journal's editorial-team page before quoting the name in a submission cover letter. Submission portal: https://authors.aps.org/Submissions. Manuscript constraints: no abstract length cap; main-text typically 8,000-15,000 words for Regular Articles (PRD enforces methodological completeness over length). We reviewed APS's AI policy framework against current Physical Review D author guidelines (accessed 2026-05-08); evidence basis is based on publicly available APS policy documentation, with the strengths and weaknesses of the policy framework noted alongside our internal anonymized submission corpus. The applicable word limit at Physical Review D is shown below: no abstract length cap; main-text typically 8,000-15,000 words for Regular Articles (PRD enforces methodological completeness over length).
The Regular Article word limit at this journal is typically 12,000 words for main text (verify article-type-specific caps in the latest author guidelines). The named editorial-culture quirk: PRD Divisional Associate Editors expect rigorous derivation and explicit comparison to existing high-energy-physics literature.
What does Physical Review D (APS)'s AI policy require?
Physical Review D authors must follow four rules under APS's AI framework, all enforced at desk-screen:
Rule 1: Disclose every AI tool used in manuscript preparation
Authors must name every generative AI tool used, its version, and how it was used. The disclosure goes in the Methods section, not the Acknowledgments. Examples that REQUIRE disclosure at Physical Review D:
- For Physical Review D-targeted manuscripts addressing particle physics, cosmology, or gravitation advance with full theoretical or observational characterization: using ChatGPT, Claude, Gemini, or similar to draft, polish, or edit manuscript text passing through Physical Review D editorial review
- For Physical Review D submissions: using AI to generate boilerplate text for limitations, ethics statements, or Physical Review D-specific response-to-reviewers letters that cite APS's framework
- For Physical Review D (APS) submissions: using AI to translate manuscript text into English from another language, with APS expecting disclosure of the source language and translation chain
- For Physical Review D literature reviews: using AI for citation discovery or summarizing prior Physical Review D work; APS's policy applies regardless of citation context
- For Physical Review D analytical pipelines: AI-assisted code generation requires Methods + code disclosure under ICMJE + COPE, particularly when code touches particle physics, cosmology, or gravitation advance with full theoretical or observational characterization analysis
Examples that do NOT require AI disclosure:
- At Physical Review D, using grammar/spell checkers (Word, Grammarly basic) that do not generate new content for the manuscript
- For Physical Review D submissions, using reference managers (Zotero, EndNote) for citation formatting against APS's style guide
- For Physical Review D (APS) statistical analysis, using established statistical software (R, Stata, SPSS) where the algorithm is the established tool documented in Physical Review D's methodological norm, not a generative AI
Rule 2: AI cannot be an author
No AI tool can be listed as an author of a Physical Review D paper, particularly for particle physics, cosmology, or gravitation advance with full theoretical or observational characterization-class submissions. Under APS's policy: authorship requires the ability to take responsibility for the content, agree to be accountable for accuracy, and to consent to publication. AI tools cannot do any of these in Physical Review D's editorial framework. This rule is consistent across all APS-published journals and applied at Physical Review D's desk-screen.
Rule 3: AI-generated figures are prohibited for original research data
Physical Review D (APS) editorial team does not accept AI-generated images, figures, or schematics that represent original research data in particle physics, cosmology, or gravitation advance with full theoretical or observational characterization-class submissions. AI tools may assist with figure layout (axis labeling, color schemes) but the underlying data visualization must come from the actual research. AI-generated diagrams used for conceptual illustrations (e.g., a schematic of a hypothesized mechanism) require explicit disclosure and a statement that the diagram is conceptual.
Rule 4: Disclose AI use in peer review participation
Reviewers writing reports for Physical Review D cannot use generative AI to draft their reports without disclosing it to the editor. Some APS journals prohibit AI-assisted reviewing entirely; Physical Review D follows APS's default of disclosure-required. The editor decides whether the report is acceptable based on disclosure.
How does Physical Review D (APS)'s AI policy compare to peer journals?
Rule | Physical Review D stance | APS default | ICMJE/COPE alignment |
|---|---|---|---|
AI authorship | Prohibited | Prohibited | ICMJE-aligned |
Disclosure location | Methods section | Methods section | ICMJE-aligned |
AI-generated figures | Prohibited for original data | Prohibited | COPE image-integrity-aligned |
Reviewer AI use | Disclosure required | Disclosure required | COPE peer-review-aligned |
Enforcement intensity | Desk-screen check | Desk-screen check | Pre-publication enforcement |
Source: https://journals.aps.org/authors/editorial-policies-practices (accessed 2026-05-08) plus Physical Review D author guidelines.
What does AI disclosure look like in a Physical Review D Methods section?
Acceptable disclosure language for Physical Review D submissions:
"For our particle physics, cosmology, or gravitation advance with full theoretical or observational characterization-focused manuscript at Physical Review D, we used ChatGPT-4o (OpenAI, version dated October 2024) to polish English-language phrasing in the Introduction and Discussion sections. We did not use generative AI for data analysis, figure generation, or substantive manuscript content. All authors reviewed and edited the AI-assisted text and take responsibility for the final manuscript."
Or, for AI-assisted code:
"For this Physical Review D submission addressing particle physics, cosmology, or gravitation advance with full theoretical or observational characterization, initial Python code for the Bayesian regression analysis was drafted with Claude 3.5 Sonnet (Anthropic, version dated December 2024). All code was reviewed, modified, and validated by the authors before use; the final version is available at [repository URL]. Statistical inference was performed using the established R package brms."
What does NOT pass Physical Review D's desk-screen:
- For Physical Review D addressing particle physics, cosmology, or gravitation advance with full theoretical or observational characterization: "AI tools were used in manuscript preparation." Too vague for APS editorial review of Physical Review D submissions; the Physical Review D editorial team needs the specific tool name, version, and specific use case
- "We acknowledge AI assistance in the Acknowledgments." (Wrong location; must be Methods)
- "ChatGPT helped write this paper." (Insufficient detail on use case)
- No disclosure when AI was used (publication-ethics violation)
What do pre-submission reviews reveal about Physical Review D's AI-disclosure desk-screen failures?
In our pre-submission review work on Physical Review D-targeted manuscripts, three patterns most consistently predict AI-policy desk-screen flags at Physical Review D (APS). Of the manuscripts we screened in 2025 targeting Physical Review D and peer venues, the patterns below are the same ones APS editorial AI working group flags during editorial review.
AI disclosure missing despite obvious AI-assisted phrasing. Physical Review D editors identify AI-drafted text by patterns like overuse of em-dashes, formulaic transitions ("In conclusion," "Furthermore"), and uniform sentence length variance. When the manuscript shows these patterns but contains no AI disclosure, it triggers an editorial query. Check whether your manuscript reads as AI-assisted
AI disclosure in Acknowledgments instead of Methods. Physical Review D editorial team flags this as a common mistake against particle physics, cosmology, or gravitation advance with full theoretical or observational characterization submissions. APS's policy specifies Methods placement so that the disclosure is part of the methodological record, not a courtesy under Physical Review D's editorial culture. Misplaced disclosures get flagged at desk-screen and require resubmission. Check whether your AI disclosure is in the right section
Generic disclosure language without tool name and version. Physical Review D editorial team requires the specific tool, its version (or access date), and the specific use case. "AI tools were used" without specifics gets returned. Check whether your AI disclosure has the required specificity
What is the Physical Review D AI-policy compliance timeline?
Stage | Duration | What happens |
|---|---|---|
Author drafts AI disclosure | 30-60 minutes | Identify all AI use, gather tool versions, write Methods paragraph |
Co-author review of disclosure | 1-2 days | All authors confirm the disclosure is complete and accurate |
Editorial desk-screen check | 1-2 weeks | Physical Review D's editorial team verifies disclosure against the manuscript |
Editorial query (if disclosure incomplete) | 5-10 days | Editor requests revision before sending to peer review |
Reviewer AI-disclosure check | During peer review | Reviewers verify the disclosure matches the manuscript style |
Source: Manusights internal review of Physical Review D-targeted submissions, 2025 cohort.
Submit If
- For Physical Review D (APS) submissions on particle physics, cosmology, or gravitation advance with full theoretical or observational characterization: the manuscript explicitly discloses every AI tool used, with name, version, and specific use case in the Methods section, calibrated to Physical Review D's editorial expectations
- For Physical Review D: no AI tool is listed as an author; all listed authors meet ICMJE authorship criteria, agree to take responsibility, and APS expects this acknowledgment in the cover letter
- For Physical Review D (APS): figures and schematics representing original research data come from the actual research, not AI generation, with Physical Review D editorial team checking image-integrity at desk-screen
- For Physical Review D submissions: the disclosure includes a statement that all human authors reviewed and edited the AI-assisted text, with APS requiring this acknowledgment per ICMJE + COPE
Readiness check
Run the scan while the topic is in front of you.
See score, top issues, and journal-fit signals before you submit.
Think Twice If
- The manuscript shows AI-drafted text patterns (em-dash overuse, formulaic transitions) but contains no AI disclosure; Physical Review D desk-screen will flag this.
- The AI disclosure is in the Acknowledgments instead of the Methods section, against APS's explicit guidance.
- The disclosure language is generic ("AI tools were used") without specifying tool name, version, and use case; Physical Review D editors return manuscripts with this gap.
- Any figure or schematic representing original research data was generated by AI; Physical Review D prohibits this regardless of disclosure.
Manusights submission-corpus signal for Physical Review D (APS). Of the manuscripts our team screened before submission to Physical Review D and peer venues in 2025, the AI-policy compliance gap most consistent across the cohort is generic disclosure language without tool-version specificity. In our analysis of anonymized Physical Review D-targeted submissions, manuscripts with complete AI disclosure (tool name, version, specific use case, all-author confirmation) clear desk-screen at the same rate as manuscripts without AI use; manuscripts with incomplete or missing disclosure trigger editorial queries that add 1-2 weeks to the timeline. APS editorial AI working group reviews disclosures against ICMJE + COPE framework requirements, and Physical Review D (APS) applies that framework consistently with APS's broader policy. Recent retractions in the Physical Review D corpus include 10.1103/PhysRevD.107.105041, 10.1103/PhysRevD.105.085306, and 10.1103/PhysRevD.108.044308. Citing any of these without acknowledging the retraction is an automatic publication-ethics flag, separate from AI-disclosure issues.
What can Physical Review D authors do to stay ahead of AI policy changes?
APS's AI policy framework continues to evolve as 2026 brings new ICMJE recommendations, COPE guidance refinements, and journal-specific clarifications. Physical Review D authors targeting particle physics, cosmology, or gravitation advance with full theoretical or observational characterization submissions should track three signals throughout 2026:
Quarterly policy updates from APS. APS editorial AI working group reviews the AI framework on a rolling basis. Physical Review D authors who pre-register their disclosure language at submission time tend to face fewer revisions during the 2026 transition period than authors who write boilerplate disclosures.
Field-specific clarifications for particle physics, cosmology, or gravitation advance with full theoretical or observational characterization. Different research domains see different AI use patterns. Physical Review D's editorial team has been refining what counts as "substantive AI use" versus "ancillary AI assistance" for particle physics, cosmology, or gravitation advance with full theoretical or observational characterization work. Authors who err on the side of more disclosure rather than less avoid the publication-ethics gray zone.
Reviewer disclosure norms. As APS extends AI-disclosure rules to peer reviewers, the response rate from Physical Review D reviewers may shift. Authors should expect that Physical Review D reviewers' use of AI tools is now also disclosed and factored into editorial decisions.
- Manusights internal preview corpus (150+ Physical Review D-targeted manuscripts, 2025 cohort)
Frequently asked questions
Yes, with mandatory disclosure. Physical Review D (APS) follows APS's AI policy under the ICMJE + COPE framework. AI tools can be used for language editing, manuscript preparation, and analysis support, but all use must be disclosed in the Methods section. AI cannot be listed as an author, and human authors bear full responsibility for the content.
In the Methods section. Authors must name the specific AI tool (e.g., ChatGPT-4o, Claude 3.5 Sonnet), its version, and describe how it was used. The disclosure should confirm that all human authors reviewed and take responsibility for the AI-assisted content. Physical Review D's editorial team checks this disclosure during desk-screen.
No. Physical Review D (APS) prohibits AI-generated figures, schematics, and images intended to represent original research data. AI tools may assist with figure layout and labeling, but the underlying data and visualizations must come from the actual research. This rule is part of APS's broader image-integrity policy.
Physical Review D treats undisclosed AI use as a publication-ethics violation following COPE guidelines. Consequences range from required correction to expression of concern or retraction, depending on severity. APS may notify the authors' institution in serious cases.
The core requirements (disclosure in Methods, no AI authorship, no AI-generated figures) are consistent across APS-published journals. Physical Review D applies these rules consistently with APS's broader policy framework. The journal-specific element is enforcement intensity at desk-screen, which at Physical Review D is calibrated by prd divisional associate editors expect rigorous derivation and explicit comparison to existing high-energy-physics literature.
Sources
- APS AI policy (accessed 2026-05-08)
- Physical Review D author guidelines (accessed 2026-05-08)
- ICMJE recommendations on AI use (accessed 2026-05-08)
- COPE guidance on AI in research publication (accessed 2026-05-08)
Before you upload
Choose the next useful decision step first.
Move from this article into the next decision-support step. The scan works best once the journal and submission plan are clearer.
Use the scan once the manuscript and target journal are concrete enough to evaluate.
Anthropic Privacy Partner. Zero-retention manuscript processing.
Where to go next
Start here
Same journal, next question
- Physical Review D Submission Guide
- How to Avoid Desk Rejection at Physical Review D
- Is Physical Review D a Good Journal? Fit Verdict
- Physical Review D Pre Submission Checklist: 12 Items Editors Verify Before Peer Review
- Physical Review D Submission Process: What Happens and What Editors Judge First
- Physical Review D Review Time: What Authors Can Actually Expect
Supporting reads
Conversion step
Choose the next useful decision step first.
Use the scan once the manuscript and target journal are concrete enough to evaluate.