Cell Reports' AI Policy: Cell Press Rules for the Broad-Scope OA Journal
Cell Reports follows the Cell Press AI policy: disclosure goes in STAR Methods, AI cannot be an author, and the same rules apply across Cell Reports Medicine and all Cell Press titles.
Senior Researcher, Oncology & Cell Biology
Author context
Specializes in manuscript preparation and peer review strategy for oncology and cell biology, with deep experience evaluating submissions to Nature Medicine, JCO, Cancer Cell, and Cell-family journals.
Cell Reports is Cell Press's workhorse. Where Cell publishes roughly 600 research articles per year and selectivity is extreme, Cell Reports handles 2,000-2,500 articles across all areas of the life sciences. It is fully open access, has a higher acceptance rate, and processes more manuscripts than any other Cell Press primary research journal. The AI policy is identical to Cell's, but the scale at which it operates creates different practical dynamics that authors should understand.
The Cell Press AI policy at Cell Reports
Cell Reports inherits its AI policy from Cell Press without modification. The rules are the same as Cell, Cancer Cell, Molecular Cell, Immunity, and Neuron:
- AI can't be an author. Generative AI tools don't meet authorship criteria.
- AI use must be disclosed in STAR Methods. Specifically in the Method Details subsection.
- AI-generated images are prohibited. No generative AI figures, graphical abstracts, or illustrations.
- Authors are fully accountable. Every co-author takes responsibility for all content.
- All preparation phases count. AI use at any stage of writing requires disclosure.
Cell Press is part of Elsevier, so the policy also aligns with Elsevier's broader AI guidelines. But Cell Press's STAR Methods requirement adds formatting specificity beyond what general Elsevier journals mandate.
Cell Reports vs. Cell: same rules, different context
Understanding where Cell Reports sits in the Cell Press hierarchy clarifies why the same AI policy plays out differently:
| Aspect | Cell | Cell Reports |
|---|---|---|
| AI policy | Cell Press standard | Cell Press standard |
| Articles/year | ~600 | ~2,000-2,500 |
| Acceptance rate | ~8% | ~25-30% |
| Access model | Subscription + OA option | Fully open access |
| APC | N/A (subscription) or ~$9,900 (OA) | ~$4,530 |
| Editorial scrutiny per paper | Very high | High, but less per-paper time |
| Peer review depth | 3+ reviewers typical | 2-3 reviewers typical |
| Post-publication visibility | High (but paywalled for some) | Very high (fully OA) |
The practical implication: Cell Reports editors handle more papers with relatively fewer editorial hours per manuscript than Cell's editors. AI disclosure compliance depends more heavily on author self-reporting and reviewer attention at Cell Reports than at Cell itself.
This doesn't mean Cell Reports doesn't care about AI policy. It means that honest, thorough self-disclosure is even more important here because the editorial team can't apply the same per-manuscript scrutiny that a lower-volume journal can.
The Cell Reports family
Cell Press publishes several journals under the "Cell Reports" brand, each following the same AI policy:
| Journal | Focus | Articles/year |
|---|---|---|
| Cell Reports | Broad life sciences | 2,000-2,500 |
| Cell Reports Medicine | Clinical/translational medicine | ~400 |
| Cell Reports Physical Science | Chemistry, physics, materials, energy | ~400 |
| Cell Reports Methods | Methods in life sciences | ~200 |
| Cell Reports Sustainability | Environmental sustainability | ~100 |
All five journals follow the Cell Press AI policy identically. If you've read the rules for one, you've read them for all.
Cell Reports Medicine deserves special attention. It publishes clinical and translational research, which means the same heightened AI considerations that apply at Nature Medicine or JAMA apply here: don't use AI for clinical data interpretation, don't process patient data through cloud AI tools, and be explicit about what AI did and didn't touch.
Writing the STAR Methods disclosure
For a cell biology paper:
"During preparation of this manuscript, the authors used ChatGPT (GPT-4, OpenAI) to improve the clarity of the Discussion section. All AI-suggested text edits were reviewed by the corresponding author (A.B.). The authors take full responsibility for the published content."
For a genomics/bioinformatics paper:
"The RNA-seq analysis pipeline was built using DESeq2, edgeR, and custom R scripts (see STAR Methods: RNA-seq Analysis). GitHub Copilot (Microsoft) was used to assist with writing the custom gene ontology enrichment scripts. All code was validated against published datasets. ChatGPT (GPT-4, OpenAI) was used to improve the language of the Results section. The authors take full responsibility for the content."
For a neuroscience paper submitted to Cell Reports:
"Calcium imaging data was processed using Suite2p (Pachitariu et al., 2017) as described in STAR Methods: Imaging Analysis. During manuscript preparation, Claude (Claude 3.5, Anthropic) was used to edit the Introduction and Methods sections for language clarity. All AI-generated suggestions were reviewed by the senior author (C.D.). The authors take full responsibility for the published content."
For a Cell Reports Medicine paper:
"No patient data was processed through cloud-based AI tools. During manuscript preparation, ChatGPT (GPT-4, OpenAI) was used to improve the readability of the Discussion. No AI tools were used for clinical data interpretation or outcome reporting. All AI-suggested text was reviewed by the clinical investigators (E.F. and G.H.). The authors take full responsibility for the content."
What requires disclosure at Cell Reports
| Use case | Disclosure required? | Notes |
|---|---|---|
| Grammar/spell check | No | Standard tools exempt |
| ChatGPT for language editing | Yes | STAR Methods, Method Details |
| AI for bioinformatics code | Yes | Specify which pipeline steps |
| Research software (DESeq2, Seurat, etc.) | No (research tool) | Standard STAR Methods |
| AI-generated diagrams | Prohibited | BioRender, Illustrator are fine |
| AI for figure legends | Yes | Part of the manuscript |
| AI for graphical abstract | Prohibited if generative | Standard design tools only |
| AI for STAR Methods Key Resources Table | No, if formatting only | Content must be author-generated |
| Translation of manuscript | Yes | Name tool and languages |
| AI for reviewer response drafting | Not strictly required | Update disclosure if manuscript was substantially revised |
Consequences of non-disclosure
Cell Press enforcement follows the standard COPE-guided process:
During review:
- Request to add disclosure to STAR Methods
- Deliberate concealment can lead to rejection
- If AI involvement in analysis code is suspected, additional reviewer scrutiny may be requested
After publication:
- Correction for minor language editing non-disclosure
- Expression of concern if AI affected data analysis or interpretation
- Retraction for fabricated data or false claims
The open access factor: Cell Reports is fully OA, meaning every paper, and every correction or retraction, is freely visible worldwide. Unlike a subscription journal where corrections might only be noticed by active subscribers, a Cell Reports correction shows up for anyone searching for the paper on Google Scholar, PubMed, or the journal's website. The reputational cost of a post-publication issue is amplified by universal access.
Volume and pattern detection: Because Cell Reports processes so many manuscripts, the editorial team accumulates pattern-recognition experience with AI disclosure issues. Editors who have seen hundreds of disclosure statements can spot missing or inadequate disclosures more efficiently than editors at low-volume journals. Don't assume that volume means lower scrutiny; it means more experienced scrutiny.
Comparison with other broad-scope life science journals
| Feature | Cell Reports | Nature Communications | PLOS ONE | eLife | Scientific Reports |
|---|---|---|---|---|---|
| Publisher | Cell Press (Elsevier) | Springer Nature | PLOS | eLife Sciences | Springer Nature |
| Articles/year | 2,000-2,500 | 6,000+ | 15,000+ | 1,500+ | 20,000+ |
| AI authorship | Prohibited | Prohibited | Prohibited | Prohibited | Prohibited |
| Disclosure location | STAR Methods | Methods | Methods | Methods | Methods |
| AI image ban | Yes | Yes | Yes | Yes | Yes |
| Access model | Gold OA | Gold OA | Gold OA | Diamond OA | Gold OA |
| APC | ~$4,530 | ~$5,790 | ~$1,805 | $0 | ~$2,190 |
Cell Reports sits between the elite Cell Press journals (Cell, Cancer Cell) and the mega-journals (PLOS ONE, Scientific Reports) in terms of selectivity and per-paper editorial attention. Its AI enforcement reflects this middle position: more systematic than a mega-journal, less intensive per-paper than Cell itself.
How Elsevier's policy layers with Cell Press's
| Aspect | Elsevier (general) | Cell Press / Cell Reports |
|---|---|---|
| Policy text | Broad guidelines | More prescriptive |
| Disclosure location | Flexible | STAR Methods required |
| Example language | General | Specific examples in guidelines |
| Editorial screening | Varies by journal | Active at Cell Press |
If you're submitting to a non-Cell-Press Elsevier journal (like a journal in the Lancet family or a standard Elsevier title), the disclosure placement is more flexible. At Cell Reports, it's specifically STAR Methods → Method Details. This formatting requirement is non-negotiable.
Practical advice for Cell Reports submissions
For standard research articles:
- Use the STAR Methods AI disclosure as your primary location
- Include tool name, version, and specific use case
- If you used AI during revisions, update the STAR Methods disclosure in your revised manuscript
For papers with computational analysis:
- Deposit all code in a public repository
- Clearly separate research software from AI writing tools in STAR Methods
- AI-generated analysis code should be validated against known results
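If an AI assistant helped write analysis code, a small check that compares your pipeline's output to published results for the same dataset makes the "validated against known results" point concrete. The sketch below is a minimal, hypothetical example in Python: the file paths, the log2FoldChange column name, and the 0.95 correlation cutoff are illustrative assumptions, not Cell Press requirements.

```python
# Minimal validation sketch (all paths, column names, and thresholds are
# hypothetical): compare the output of an AI-assisted differential
# expression script against a published reference table for the same data.
import pandas as pd

# Results produced by the AI-assisted pipeline (hypothetical path)
ours = pd.read_csv("results/deg_table.csv", index_col="gene")
# Published reference results for the same dataset (hypothetical path)
ref = pd.read_csv("reference/published_deg_table.csv", index_col="gene")

# Compare fold changes only for genes present in both tables
shared = ours.index.intersection(ref.index)
corr = ours.loc[shared, "log2FoldChange"].corr(ref.loc[shared, "log2FoldChange"])

print(f"Genes compared: {len(shared)}")
print(f"log2FC correlation with published results: {corr:.3f}")

# Illustrative cutoff, not a Cell Press requirement: flag poor agreement
# for manual review before relying on the AI-assisted code.
assert corr > 0.95, "Pipeline diverges from published reference; review before submission"
```

Keeping a check like this in the deposited repository also makes it easy for reviewers to see how the AI-assisted code was verified.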
For Cell Reports Medicine submissions:
- Don't process patient data through cloud AI
- Keep AI away from clinical interpretation
- Be explicit about what AI didn't do, especially for clinical content
For non-native English speakers:
- AI-assisted language editing is perfectly acceptable at Cell Reports
- Disclose it in STAR Methods with the tool name, version, and what you asked it to do
- This is a legitimate use case that Cell Press has publicly supported
Before submission checklist:
- [ ] AI disclosure in STAR Methods → Method Details
- [ ] Tool names, versions, and specific use cases listed
- [ ] Research tools in standard STAR Methods (not AI disclosure)
- [ ] No generative AI images or graphical abstract
- [ ] Analysis code validated and deposited
- [ ] All co-authors reviewed the disclosure
- [ ] Graphical abstract made with BioRender, Illustrator, or similar
A free manuscript assessment can help confirm that your Cell Reports manuscript meets Cell Press requirements before you submit.