Nature Methods' AI Policy: When Your Methods Journal Needs a Methods Disclosure
Nature Methods follows Springer Nature's AI policy, requiring disclosure in the Methods section, with unique considerations for papers describing AI methods, benchmarking studies, and code availability.
Nature Methods at a glance
Key metrics to place the journal before deciding whether it fits your manuscript and career goals.
What makes this journal worth targeting
- IF 32.1 puts Nature Methods in a visible tier — citations from papers here carry real weight.
- Scope specificity matters more than impact factor for most manuscript decisions.
- Acceptance rate of ~8-10% means fit determines most outcomes.
When to look elsewhere
- When your paper sits at the edge of the journal's stated scope — borderline fit rarely improves after submission.
- If timeline matters: Nature Methods takes ~7 days. A faster-turnaround journal may suit a grant or job deadline better.
- If OA is required: gold OA costs $12,690. Check institutional agreements before submitting.
Quick answer: There's a certain irony in writing about AI disclosure policy for a journal whose entire identity is publishing new methods, including AI methods. Nature Methods has published some of the most-cited papers in computational biology and machine learning for science: tools that thousands of researchers use daily.
Nature Methods AI Policy at a Glance
- AI authorship: Prohibited. AI tools cannot be listed as authors and cannot take accountability for the work.
- AI disclosure: Required. Disclose use of AI tools (e.g., ChatGPT, Claude, Gemini) in the Methods section.
- AI-generated images: Prohibited. AI-created figures, illustrations, or visualizations are not permitted in the manuscript.
- Copy editing: Copy editing for grammar and language is exempt from disclosure.
The standard policy
Nature Methods follows the Springer Nature AI policy without modification. Every rule that applies to Nature, Nature Medicine, Nature Biotechnology, or Scientific Reports applies here:
- AI can't be an author. No exceptions, even if your paper is literally about building a better language model.
- Disclosure goes in Methods. Describe which AI tool you used for manuscript preparation, how you used it, and which sections it touched.
- AI-generated images are banned. Generative AI tools can't produce figures, graphical abstracts, or visual content.
- Copy editing is exempt. Standard grammar tools like Grammarly don't need disclosure.
- Authors bear full responsibility for all content, including AI-assisted sections.
The policy covers 3,000+ Springer Nature journals. Nature Methods isn't special in terms of the rules; it's special in terms of the audience.
Why Nature Methods is different in practice
Nature Methods publishes three types of content that create unique AI disclosure dynamics:
Papers about AI/ML methods
If you've developed a new deep learning architecture for protein folding, a foundation model for single-cell analysis, or an NLP tool for mining biomedical text, your paper's subject is AI. The disclosure policy doesn't cover this: your AI method is scientific methodology, described in the standard Methods section.
What the policy does cover is manuscript preparation: using ChatGPT to edit your writing, Copilot to help with documentation code, or Claude to restructure your Results section. That kind of use gets its own disclosure.
The confusion arises when authors write something like: "AI was used throughout this work." At Nature Methods, this could mean the AI method they developed, the AI they used to write code, or the AI they used to polish text. Reviewers need to know which is which.
Papers about non-AI methods that use AI in analysis
Your paper might describe a new microscopy technique, a novel assay, or a biochemical protocol. But you used machine learning to process the imaging data or AI to generate analysis code. Here, the AI use is part of your research pipeline but isn't the method being published. This should be described in your Methods as a data analysis approach, with any manuscript preparation AI use disclosed separately.
Benchmarking papers
Nature Methods publishes benchmarking studies that compare multiple computational tools. If you used AI to help write benchmarking scripts, select parameters, or generate comparison tables, this needs disclosure. Benchmarking integrity is central to the journal's value; reviewers will scrutinize whether AI involvement in the benchmarking process could have introduced bias.
Writing the disclosure for Nature Methods
For a paper describing a new AI method:
"DeepCellSeg, the method described in this paper, was developed using PyTorch 2.0 and trained on curated single-cell imaging datasets (see Methods: Model Architecture and Training). Separately, during manuscript preparation, the authors used ChatGPT (GPT-4, OpenAI) to improve the clarity of the Introduction and Discussion sections. All AI-edited text was reviewed by the corresponding author. The authors take full responsibility for the published content."
Key elements: The word "Separately" is doing important work here. It signals to reviewers that you're distinguishing between the research method and the writing tool.
For a paper describing a wet-lab method:
"During preparation of this manuscript, the authors used Claude (Claude 3.5, Anthropic) to assist with editing the Protocol section for language clarity and to help draft the Troubleshooting table. All content was verified against the experimental results by the senior author (R.T.). GitHub Copilot (Microsoft) was used to write data processing scripts in R; all scripts were validated against manually processed datasets."
For a benchmarking study:
"The benchmarking framework and evaluation scripts were written by the authors without AI assistance. During manuscript preparation, the authors used ChatGPT (GPT-4, OpenAI) to improve the readability of the Results section. The selection of methods for benchmarking, the parameter choices, and the performance metrics were determined by the authors based on the criteria described in Methods."
Why explicitly state that the benchmarking was done without AI? Because at Nature Methods, readers and reviewers care deeply about benchmarking independence. If AI selected which tools to compare or which parameters to use, that's a potential bias that needs to be visible.
Code and data availability
Nature Methods has among the strongest code availability requirements in scientific publishing. For computational papers, the journal expects:
- Source code in a public repository (GitHub, GitLab, Zenodo)
- Documentation sufficient for other researchers to reproduce your results
- Test datasets or instructions for generating them
- Environment specifications (Docker containers, conda files)
This intersects with AI disclosure in an important way. If you used AI to generate portions of your code:
- The code should still be available. AI-generated code isn't exempt from the open-source requirement.
- Consider marking AI-generated sections. You're not required to do this, but for a methods paper where reproducibility is the entire point, clearly documented code, including which portions were AI-assisted, strengthens your paper.
- The AI-generated code must work independently. Your method's reproducibility can't depend on access to ChatGPT or Copilot. If someone clones your repo in five years, the code should run without AI assistance.
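One lightweight way to document AI-assisted portions of a repository is a provenance note in the docstring of each affected function. This is a hypothetical convention, not a journal requirement; the function and its name are illustrative:

```python
def threshold_mask(image, cutoff):
    """Return a binary mask of pixels above cutoff.

    Provenance: initial draft generated with GitHub Copilot, then
    reviewed and validated by the authors against manually processed
    data. (Hypothetical labeling convention, not a journal mandate.)
    """
    # Pure-Python implementation: no AI service is needed at runtime,
    # so the code remains reproducible on its own.
    return [[1 if px > cutoff else 0 for px in row] for row in image]
```

A convention like this costs nothing and makes the AI-assisted portions auditable when a reviewer, or a user five years from now, reads the repository.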
What requires disclosure
| Use case | Disclosure required? | Nature Methods-specific notes |
|---|---|---|
| Standard grammar tools | No | Grammarly, Word spell check exempt |
| ChatGPT for language editing | Yes | Methods section disclosure |
| AI for code generation | Yes | Specify what code, confirm validation |
| AI for benchmarking scripts | Yes | Clarify AI didn't influence method selection |
| AI for figure aesthetics | Yes (if generative) | Computational plots from data are fine |
| AI for protocol writing | Yes | Verify against experimental reality |
| AI for documentation/README | Gray area | Disclose if substantial |
| AI to debug existing code | Depends | Minor debugging may not need disclosure; major rewrites do |
| AI for tutorial/vignette writing | Yes | Common in methods papers; disclose |
The tutorial and vignette point is specific to Nature Methods. Methods papers often include tutorials showing users how to run the software. If AI helped write these, disclose it; they're part of the manuscript.
Consequences of non-disclosure
Standard Springer Nature enforcement applies:
During review: the editor requests that the disclosure be added. Nature Methods reviewers are computational experts who may notice AI-generated patterns in code or text more readily than reviewers at clinical journals.
After publication:
- Correction for minor undisclosed language editing
- Expression of concern if AI use affected benchmarking or methodology descriptions
- Retraction if AI generated fabricated benchmarking results or false performance claims
The reproducibility angle: For Nature Methods, the most damaging scenario isn't AI-polished prose; it's AI involvement in code or benchmarking that affects reproducibility. If a published method can't be reproduced because the code was AI-generated and doesn't work independently, that's a more serious issue than a stylistic non-disclosure. The journal's reputation is built on methods that work.
Comparison with other methods-focused journals
| Feature | Nature Methods | Bioinformatics | Genome Biology | Nucleic Acids Research | PLOS Computational Biology |
|---|---|---|---|---|---|
| Publisher | Springer Nature | Oxford UP | Springer Nature | Oxford UP | PLOS |
| AI authorship | Prohibited | Prohibited | Prohibited | Prohibited | Prohibited |
| Disclosure location | Methods | Methods | Methods | Methods | Methods |
| AI image ban | Yes | Yes | Yes | Yes | Yes |
| Code availability | Required | Required | Required | Required | Required |
| AI/ML as research subject | Very common | Very common | Common | Very common | Common |
All five journals require code availability and follow similar AI disclosure policies. The main difference is editorial culture: Nature Methods applies the most rigorous peer review to methods claims (typically 3+ reviewers including domain experts and methodologists), which means AI-related disclosure issues are more likely to be caught during review.
Practical advice for Nature Methods submissions
For AI/ML method papers:
- Use separate Methods subsections: one for your research method's architecture and training, another for manuscript preparation AI disclosure
- If your method uses an LLM as a component, be extremely clear about what the LLM does in your research pipeline vs. what LLMs did for your writing
- Include ablation studies and performance comparisons that were conducted without AI assistance (or disclose if AI helped)
For wet-lab/experimental methods papers:
- If AI helped write the protocol, verify every step against your actual experimental workflow. AI-generated protocols sound reasonable but may include steps that don't reflect your specific setup.
- Troubleshooting tables are a common Nature Methods feature; don't generate them entirely with AI. They should reflect genuine problems you encountered in the lab.
For all submission types:
- Make your code repository clean and well-documented before submission. Nature Methods reviewers will look at your code.
- If you used AI to write tests for your code, disclose this and ensure the tests actually validate meaningful functionality, not just surface-level checks.
- Don't use AI to generate synthetic benchmarking data. Nature Methods' statistical reviewers will check whether your benchmarks used real or synthetic data.
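To illustrate the difference between a surface-level check and a test that validates meaningful functionality, here is a hypothetical sketch (the `normalize` function and test names are invented for illustration, not taken from any real submission):

```python
# Hypothetical analysis step: normalize raw counts to counts-per-scale.
def normalize(counts, scale=10_000):
    total = sum(counts)
    if total == 0:
        raise ValueError("counts sum to zero; cannot normalize")
    return [c * scale / total for c in counts]

# Surface-level check: passes even if normalize() returned garbage
# of the right type. This is the kind of test AI tools often produce.
def test_returns_list():
    assert isinstance(normalize([1, 2, 3]), list)

# Meaningful test: validates the numerical invariants downstream
# results actually depend on (proportions preserved, total = scale).
def test_preserves_proportions():
    out = normalize([1, 1, 2], scale=100)
    assert out == [25.0, 25.0, 50.0]
    assert abs(sum(out) - 100) < 1e-9

test_returns_list()
test_preserves_proportions()
```

If AI drafted your test suite, read each test and ask what failure it could actually catch; tests in the first style catch almost none.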
Before submission checklist:
- [ ] Research AI and writing AI clearly separated in Methods
- [ ] Disclosure includes tool name, version, and use case
- [ ] No generative AI images
- [ ] Code deposited in public repository with documentation
- [ ] AI-generated code sections validated independently
- [ ] Benchmarking conducted without AI bias
- [ ] Co-authors reviewed and approved AI disclosure
What should you do about Nature Methods' AI policy?
Comply proactively if:
- You used any AI tool (ChatGPT, Grammarly, Copilot) during manuscript preparation
- The journal requires AI use disclosure in the methods or acknowledgments
- Your institution has its own AI use policy that may be stricter
Less concerned if:
- You used AI only for grammar/spell checking (most journals exempt this)
- The journal does not have a formal AI policy yet
- Your use was limited to literature search or reference management
Frequently asked questions
**Can authors use AI tools when submitting to Nature Methods?** Yes, under Springer Nature's standard AI policy. Authors can use AI tools for language editing and manuscript preparation with mandatory disclosure in the Methods section. AI can't be listed as an author, and AI-generated images are prohibited.
**How should I disclose AI use when my paper describes an AI method?** Use clearly separated disclosures. Your research method (the AI tool you developed) belongs in the main Methods section as scientific methodology. AI tools used for writing the paper (ChatGPT, Claude, etc.) go in a separate paragraph or subsection labeled as manuscript preparation disclosure. Nature Methods reviewers are methodologists; they'll appreciate the clarity.
**Do I have to make my code available?** Yes. Nature Methods has strong code and data availability requirements. For papers describing computational methods, the journal expects source code to be deposited in a public repository (GitHub, Zenodo, etc.) with documentation sufficient for reproduction. This isn't part of the AI disclosure policy specifically, but it's closely related.
**Can I use AI in a benchmarking study?** Yes, but disclose it if the AI tool played a meaningful role. If you used ChatGPT to write benchmarking scripts, that's a writing/code disclosure. If you used AI to select which methods to benchmark against, that should be described as part of your benchmarking methodology. The distinction matters because Nature Methods reviewers will scrutinize benchmarking choices carefully.
**What happens if undisclosed AI use is discovered?** Standard Springer Nature consequences: correction, expression of concern, or retraction depending on severity. For a methods journal specifically, undisclosed AI in code or benchmarking could undermine the paper's reproducibility claims, which is the core value proposition of a Nature Methods publication.