Publishing Strategy · 8 min read · Updated Mar 25, 2026

Nature Methods' AI Policy: When Your Methods Journal Needs a Methods Disclosure

Nature Methods follows Springer Nature's AI policy, which requires disclosure in the Methods section, with unique considerations for papers describing AI methods, benchmarking studies, and code availability.

Senior Researcher, Oncology & Cell Biology

Author context

Specializes in manuscript preparation and peer review strategy for oncology and cell biology, with deep experience evaluating submissions to Nature Medicine, JCO, Cancer Cell, and Cell-family journals.


There's a certain irony in writing about AI disclosure policy for a journal whose entire identity is publishing new methods, including AI methods. Nature Methods has published some of the most-cited papers in computational biology and machine learning for science: tools that thousands of researchers use daily. When its own authors use ChatGPT to polish their prose, the disclosure requirement creates a layer of meta-commentary that doesn't exist at most journals. But the rules are straightforward, even if the context is unusual.

The standard policy

Nature Methods follows the Springer Nature AI policy without modification. Every rule that applies to Nature, Nature Medicine, Nature Biotechnology, or Scientific Reports applies here:

  1. AI can't be an author. No exceptions, even if your paper is literally about building a better language model.
  2. Disclosure goes in Methods. Describe which AI tool you used for manuscript preparation, how you used it, and which sections it touched.
  3. AI-generated images are banned. Generative AI tools can't produce figures, graphical abstracts, or visual content.
  4. Copy editing is exempt. Standard grammar tools like Grammarly don't need disclosure.
  5. Authors bear full responsibility for all content, including AI-assisted sections.

The policy covers more than 3,000 Springer Nature journals. Nature Methods isn't special in terms of the rules; it's special in terms of the audience.

Why Nature Methods is different in practice

Nature Methods publishes three types of content that create unique AI disclosure dynamics:

Papers about AI/ML methods

If you've developed a new deep learning architecture for protein folding, a foundation model for single-cell analysis, or an NLP tool for mining biomedical text, your paper's subject is AI. The disclosure policy doesn't cover this: your AI method is scientific methodology, described in the standard Methods section.

What the policy does cover is manuscript preparation: using ChatGPT to edit your writing, Copilot to help with documentation or code, or Claude to restructure your Results section. That use gets its own disclosure.

The confusion arises when authors write something like: "AI was used throughout this work." At Nature Methods, this could mean the AI method they developed, the AI they used to write code, or the AI they used to polish text. Reviewers need to know which is which.

Papers about non-AI methods that use AI in analysis

Your paper might describe a new microscopy technique, a novel assay, or a biochemical protocol. But you used machine learning to process the imaging data or AI to generate analysis code. Here, the AI use is part of your research pipeline but isn't the method being published. This should be described in your Methods as a data analysis approach, with any manuscript preparation AI use disclosed separately.

Benchmarking papers

Nature Methods publishes benchmarking studies that compare multiple computational tools. If you used AI to help write benchmarking scripts, select parameters, or generate comparison tables, this needs disclosure. Benchmarking integrity is central to the journal's value; reviewers will scrutinize whether AI involvement in the benchmarking process could have introduced bias.

Writing the disclosure for Nature Methods

For a paper describing a new AI method:

"DeepCellSeg, the method described in this paper, was developed using PyTorch 2.0 and trained on curated single-cell imaging datasets (see Methods: Model Architecture and Training). Separately, during manuscript preparation, the authors used ChatGPT (GPT-4, OpenAI) to improve the clarity of the Introduction and Discussion sections. All AI-edited text was reviewed by the corresponding author. The authors take full responsibility for the published content."

Key elements: The word "Separately" is doing important work here. It signals to reviewers that you're distinguishing between the research method and the writing tool.

For a paper describing a wet-lab method:

"During preparation of this manuscript, the authors used Claude (Claude 3.5, Anthropic) to assist with editing the Protocol section for language clarity and to help draft the Troubleshooting table. All content was verified against the experimental results by the senior author (R.T.). GitHub Copilot (Microsoft) was used to write data processing scripts in R; all scripts were validated against manually processed datasets."

For a benchmarking study:

"The benchmarking framework and evaluation scripts were written by the authors without AI assistance. During manuscript preparation, the authors used ChatGPT (GPT-4, OpenAI) to improve the readability of the Results section. The selection of methods for benchmarking, the parameter choices, and the performance metrics were determined by the authors based on the criteria described in Methods."

Why explicitly state that the benchmarking was done without AI? Because at Nature Methods, readers and reviewers care deeply about benchmarking independence. If AI selected which tools to compare or which parameters to use, that's a potential bias that needs to be visible.

Code and data availability

Nature Methods has some of the strongest code availability requirements in scientific publishing. For computational papers, the journal expects:

  • Source code in a public repository (GitHub, GitLab, Zenodo)
  • Documentation sufficient for other researchers to reproduce your results
  • Test datasets or instructions for generating them
  • Environment specifications (Docker containers, conda files)

This intersects with AI disclosure in an important way. If you used AI to generate portions of your code:

  1. The code should still be available. AI-generated code isn't exempt from the open-source requirement.
  2. Consider marking AI-generated sections. You're not required to do this, but for a methods paper where reproducibility is the entire point, clearly documented code, including which portions were AI-assisted, strengthens your paper.
  3. The AI-generated code must work independently. Your method's reproducibility can't depend on access to ChatGPT or Copilot. If someone clones your repo in five years, the code should run without AI assistance.
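One lightweight way to put points 2 and 3 into practice is to note provenance in a function's docstring and pin the code to a manually computed reference value, so the repository documents both which portions were AI-assisted and how they were validated. This is a sketch, not a Nature Methods requirement; the function and data below are hypothetical:

```python
import numpy as np

def normalize_counts(counts, scale=1e4):
    """Library-size normalization of a count matrix.

    Provenance note: initial draft generated with GitHub Copilot;
    reviewed, corrected, and validated by the authors against a
    manually computed reference (see assertion below).
    """
    totals = counts.sum(axis=1, keepdims=True)
    return counts / totals * scale

# Validation against a hand-computed reference, mirroring the kind of
# check the disclosure statement promises ("validated against manually
# processed datasets"). Runs standalone, with no AI tool in the loop.
counts = np.array([[10.0, 90.0], [25.0, 75.0]])
expected = np.array([[1000.0, 9000.0], [2500.0, 7500.0]])
assert np.allclose(normalize_counts(counts), expected)
```

Because the validation is an ordinary assertion in the repository, anyone cloning the code in five years can re-run it without access to ChatGPT or Copilot.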

What requires disclosure

| Use case | Disclosure required? | Nature Methods-specific notes |
| --- | --- | --- |
| Standard grammar tools | No | Grammarly, Word spell check exempt |
| ChatGPT for language editing | Yes | Methods section disclosure |
| AI for code generation | Yes | Specify what code, confirm validation |
| AI for benchmarking scripts | Yes | Clarify AI didn't influence method selection |
| AI for figure aesthetics | Yes (if generative) | Computational plots from data are fine |
| AI for protocol writing | Yes | Verify against experimental reality |
| AI for documentation/README | Gray area | Disclose if substantial |
| AI to debug existing code | Depends | Minor debugging may not need disclosure; major rewrites do |
| AI for tutorial/vignette writing | Yes | Common in methods papers; disclose |

The tutorial and vignette point is specific to Nature Methods. Methods papers often include tutorials showing users how to run the software. If AI helped write these, disclose it; they're part of the manuscript.

Consequences of non-disclosure

Standard Springer Nature enforcement applies:

During review: Editor requests disclosure addition. Nature Methods reviewers are computational experts who may notice AI-generated patterns in code or text more readily than reviewers at clinical journals.

After publication:

  • Correction for minor undisclosed language editing
  • Expression of concern if AI use affected benchmarking or methodology descriptions
  • Retraction if AI generated fabricated benchmarking results or false performance claims

The reproducibility angle: For Nature Methods, the most damaging scenario isn't AI-polished prose; it's AI involvement in code or benchmarking that affects reproducibility. If a published method can't be reproduced because the code was AI-generated and doesn't work independently, that's a more serious issue than a stylistic non-disclosure. The journal's reputation is built on methods that work.

Comparison with other methods-focused journals

| Feature | Nature Methods | Bioinformatics | Genome Biology | Nucleic Acids Research | PLOS Computational Biology |
| --- | --- | --- | --- | --- | --- |
| Publisher | Springer Nature | Oxford UP | Springer Nature | Oxford UP | PLOS |
| AI authorship | Prohibited | Prohibited | Prohibited | Prohibited | Prohibited |
| Disclosure location | Methods | Methods | Methods | Methods | Methods |
| AI image ban | Yes | Yes | Yes | Yes | Yes |
| Code availability | Required | Required | Required | Required | Required |
| AI/ML as research subject | Very common | Very common | Common | Very common | Common |

All five journals require code availability and follow similar AI disclosure policies. The main difference is editorial culture: Nature Methods applies the most rigorous peer review to methods claims (typically 3+ reviewers including domain experts and methodologists), which means AI-related disclosure issues are more likely to be caught during review.

Practical advice for Nature Methods submissions

For AI/ML method papers:

  • Use separate Methods subsections: one for your research method's architecture and training, another for manuscript preparation AI disclosure
  • If your method uses an LLM as a component, be extremely clear about what the LLM does in your research pipeline vs. what LLMs did for your writing
  • Include ablation studies and performance comparisons that were conducted without AI assistance (or disclose if AI helped)

For wet-lab/experimental methods papers:

  • If AI helped write the protocol, verify every step against your actual experimental workflow. AI-generated protocols sound reasonable but may include steps that don't reflect your specific setup.
  • For troubleshooting tables, a common Nature Methods feature, don't generate these entirely with AI. They should reflect genuine problems you encountered in the lab.

For all submission types:

  • Make your code repository clean and well-documented before submission. Nature Methods reviewers will look at your code.
  • If you used AI to write tests for your code, disclose this and ensure the tests actually validate meaningful functionality, not just surface-level checks.
  • Don't use AI to generate synthetic benchmarking data. Nature Methods' statistical reviewers will check whether your benchmarks used real or synthetic data.
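The difference between a surface-level check and a test that validates meaningful functionality can be sketched in a few lines. The `segment_threshold` function here is hypothetical, standing in for any analysis routine whose tests you might have drafted with AI assistance:

```python
import numpy as np

def segment_threshold(img, thresh=0.5):
    """Toy segmentation: binary mask of pixels above a threshold."""
    return (np.asarray(img) > thresh).astype(int)

img = np.array([[0.1, 0.9],
                [0.6, 0.2]])

# Surface-level check: only proves the function runs and returns
# an array of the right shape. A broken threshold would still pass.
assert segment_threshold(img).shape == img.shape

# Meaningful test: pins behavior to a hand-computed expected mask,
# so a silent change in the thresholding logic would be caught.
assert np.array_equal(segment_threshold(img),
                      np.array([[0, 1],
                                [1, 0]]))
```

If AI drafted your tests, read each assertion and ask whether it would fail if the underlying method were wrong; shape and type checks alone usually would not.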

Before submission checklist:

  • [ ] Research AI and writing AI clearly separated in Methods
  • [ ] Disclosure includes tool name, version, and use case
  • [ ] No generative AI images
  • [ ] Code deposited in public repository with documentation
  • [ ] AI-generated code sections validated independently
  • [ ] Benchmarking conducted without AI bias
  • [ ] Co-authors reviewed and approved AI disclosure

A free manuscript assessment can help you verify that your Nature Methods manuscript meets editorial and disclosure standards before submission.

Sources

  1. Springer Nature AI policy
  2. Nature Methods author guidelines
  3. Nature Methods code and data availability policy
  4. Nature editorial: Tools such as ChatGPT threaten transparent science
  5. COPE position statement on AI
