Molecular Cell's AI Policy: Cell Press Rules for Structural and Molecular Biology Authors
Molecular Cell follows Cell Press AI rules requiring disclosure in STAR Methods, prohibiting AI authorship and AI-generated images, with specific guidance for AlphaFold and cryo-EM workflows.
Senior Researcher, Oncology & Cell Biology
Author context
Specializes in manuscript preparation and peer review strategy for oncology and cell biology, with deep experience evaluating submissions to Nature Medicine, JCO, Cancer Cell, and Cell-family journals.
Molecular biology sits at the center of AI's impact on science, not because of ChatGPT, but because of AlphaFold. When DeepMind's protein structure prediction tool reshaped how every structural biologist works, it created a generation of researchers who use AI as a daily research instrument. Molecular Cell's AI policy doesn't cover AlphaFold (that's a research tool), but it does cover the AI tools you might use to write about your AlphaFold-informed experiments. For a field that's already deeply comfortable with AI, the distinction between research AI and writing AI is intuitive but still needs to be documented correctly.
The Cell Press policy
Molecular Cell follows the Cell Press AI policy without modification. The same rules apply to Cell, Cancer Cell, Immunity, Cell Metabolism, Neuron, and Cell Reports:
- AI can't be an author. Generative AI tools don't meet Cell Press authorship criteria.
- AI use must be disclosed in STAR Methods. Specifically under Method Details.
- AI-generated images are prohibited. No generative AI figures, graphical abstracts, or illustrations.
- Authors are fully accountable. Every co-author takes responsibility for all content.
- All preparation phases count. AI use at any stage of writing requires disclosure.
The policy builds on Elsevier's broader guidelines (Cell Press is part of Elsevier), but the Cell Press STAR Methods requirement adds structure beyond what general Elsevier journals mandate.
The AlphaFold question
This comes up constantly for Molecular Cell authors. Does showing an AlphaFold-predicted structure as a figure violate the AI-generated image ban?
No. Here's why:
AlphaFold (and similar tools like ESMFold, RoseTTAFold, OmegaFold) generates computational predictions from protein sequence data. The output is a structural model, a scientific result derived from data through a defined computational method. This is fundamentally different from asking Midjourney to "draw a protein structure" based on a text prompt.
The same principle applies to:
- Molecular dynamics simulation snapshots: computational results, not generated images
- Docking predictions rendered as figures: scientific outputs
- Homology models built with SWISS-MODEL or similar: computational results
- Electrostatic surface maps calculated from structure data: data-derived visualizations
What IS prohibited:
- Using DALL-E to generate a schematic of protein-protein interactions
- Using Midjourney to create a graphical abstract showing molecular mechanisms
- Using any generative AI to illustrate a biological concept that isn't derived from data
How to describe AlphaFold in STAR Methods:
AlphaFold belongs in your standard Method Details as a computational tool:
"Protein structure predictions were generated using AlphaFold2 (Jumper et al., 2021) with default parameters. Predicted structures with pLDDT scores >70 were used for subsequent analysis. Structural figures were rendered in PyMOL (Schrödinger)."
This is research methodology, not an AI writing disclosure. It goes in the computational methods subsection, not alongside your ChatGPT disclosure.
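The pLDDT cutoff in that example is easy to apply programmatically: AlphaFold2 writes per-residue pLDDT into the B-factor column of its PDB output, so a short script can report which residues clear the threshold. A minimal sketch, assuming standard PDB column layout; the function names are illustrative, not part of any AlphaFold tooling:

```python
def residue_plddt(pdb_lines):
    """Collect mean per-residue pLDDT from AlphaFold2 PDB output.

    AlphaFold2 stores per-residue pLDDT in the B-factor column
    (columns 61-66 of each ATOM record)."""
    scores = {}
    for line in pdb_lines:
        if line.startswith("ATOM"):
            resid = int(line[22:26])        # residue sequence number
            plddt = float(line[60:66])      # B-factor field = pLDDT
            scores.setdefault(resid, []).append(plddt)
    return {r: sum(v) / len(v) for r, v in scores.items()}

def confident_residues(pdb_lines, cutoff=70.0):
    """Residues passing the pLDDT cutoff quoted in the example disclosure."""
    return sorted(r for r, s in residue_plddt(pdb_lines).items() if s > cutoff)
```

In practice you would run this over the `ranked_0.pdb` model before trimming low-confidence regions for downstream analysis.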
Writing the STAR Methods AI disclosure
For a structural biology paper:
"Protein structure predictions were performed using AlphaFold2 as described in STAR Methods: Structural Prediction. Separately, during manuscript preparation, the authors used ChatGPT (GPT-4, OpenAI) to improve the language clarity of the Discussion section. All AI-suggested text edits were reviewed by the corresponding author (J.K.). The authors take full responsibility for the published content."
For a paper with cryo-EM data:
"Cryo-EM data processing was performed using cryoSPARC v4 and RELION 4.0 (see STAR Methods: Cryo-EM Data Processing). During manuscript preparation, GitHub Copilot (Microsoft) was used to assist with writing Python scripts for automated particle picking parameter optimization. ChatGPT (GPT-4, OpenAI) was used to improve the readability of the Results section. All code was validated against established processing workflows, and all text edits were reviewed by the authors."
For a biochemistry/enzymology paper:
"During preparation of this manuscript, the authors used Claude (Claude 3.5, Anthropic) to edit the Introduction for language clarity and to assist with formatting the kinetic parameters table. All content was verified against the experimental data by the senior author (M.R.). The authors take full responsibility for the content of this article."
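If your lab writes disclosures like these often, a small template helper keeps the wording consistent across manuscripts. This is a hedged sketch: the function name and phrasing are illustrative, and the output should always be adapted to the journal's current guidance:

```python
def ai_disclosure(tool, vendor, task, reviewer):
    """Compose a STAR Methods-style AI-use disclosure sentence.

    Illustrative template only; not an official Cell Press format."""
    return (f"During manuscript preparation, the authors used {tool} "
            f"({vendor}) to {task}. All AI-assisted content was reviewed "
            f"by {reviewer}, and the authors take full responsibility "
            f"for the published content.")

# Example (hypothetical manuscript):
statement = ai_disclosure(
    "ChatGPT (GPT-4)", "OpenAI",
    "improve the language clarity of the Discussion section",
    "the corresponding author",
)
```

The value of a template is less the prose than the checklist it enforces: tool, vendor, task, and named reviewer are the four elements every version of the Cell Press disclosure needs.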
The cryo-EM software distinction
Molecular Cell publishes many cryo-EM structures. The software stack for cryo-EM processing is extensive:
Research tools (STAR Methods, standard description): RELION, cryoSPARC, MotionCor2, CTFFIND, Phenix, Coot, ChimeraX
AI-assisted tools that are research methods: DeepEMhancer (AI-based map sharpening), Topaz (AI-based particle picking), ModelAngelo (AI-based model building)
Writing tools requiring disclosure: ChatGPT for editing text, Copilot for writing processing scripts
The middle category, AI tools used for cryo-EM processing, belongs in your standard STAR Methods as part of your research pipeline. These are computational tools, not manuscript preparation aids. Describe them the way you'd describe RELION: name, version, parameters, what they were used for.
What requires disclosure at Molecular Cell
| Use case | Disclosure required? | Notes |
|---|---|---|
| Grammar/spell check | No | Standard tools exempt |
| ChatGPT for language editing | Yes | STAR Methods, Method Details |
| AlphaFold for structure prediction | No (research tool) | Standard STAR Methods |
| Copilot for processing scripts | Yes | Specify which scripts |
| AI for molecular dynamics code | Yes | Confirm validation |
| DeepEMhancer for map sharpening | No (research tool) | Standard STAR Methods |
| AI-generated protein schematics | Prohibited if generative | Use PyMOL, ChimeraX, BioRender |
| AI for figure layout | Gray area; disclose if substantial | Aesthetic arrangement vs. content generation |
| AI for table formatting | Minor; disclose if AI generated the content | Formatting vs. creating table data |
| AI to edit STAR Methods text | Yes | STAR Methods is part of the manuscript |
Consequences of non-disclosure
Cell Press follows standard COPE-guided enforcement:
During review:
- Request to add disclosure to STAR Methods
- If AI involvement in model building or structure validation is suspected, additional review may be required
- Deliberate concealment can lead to rejection
After publication:
- Correction for undisclosed language editing
- Expression of concern if AI affected structural analysis or interpretation
- Retraction for fabricated data or false structural claims
The structural biology stakes: if you report a cryo-EM or crystal structure in Molecular Cell without disclosing AI involvement in model building or refinement, the structure's credibility is on the line. PDB depositions are permanent scientific records. If a deposited structure is later found to have undisclosed AI involvement in its building, this creates problems for everyone who cited or used that structure.
To be clear: using AI tools like ModelAngelo or DeepEMhancer for structure determination is perfectly legitimate and should be described in Methods as computational tools. The issue is when AI use in any part of the process, including writing, isn't properly documented.
Comparison with other molecular biology journals
| Feature | Molecular Cell | Nature Structural & Molecular Biology | Cell | EMBO Journal | eLife |
|---|---|---|---|---|---|
| Publisher | Cell Press (Elsevier) | Springer Nature | Cell Press (Elsevier) | EMBO Press | eLife Sciences |
| AI authorship | Prohibited | Prohibited | Prohibited | Prohibited | Prohibited |
| Disclosure location | STAR Methods | Methods | STAR Methods | Methods | Methods |
| AI image ban | Yes | Yes | Yes | Yes | Yes |
| Cryo-EM papers | Common | Very common | Common | Common | Common |
| AlphaFold papers | Common | Very common | Common | Common | Very common |
| PDB deposition required | Yes | Yes | Yes | Yes | Yes |
Nature Structural & Molecular Biology (NSMB) is Molecular Cell's most direct competitor. NSMB follows Springer Nature's AI policy with free-form Methods disclosure; Molecular Cell uses STAR Methods. The substantive requirements are identical. If you're preparing a manuscript for both journals as backup options, the AI disclosure content will be the same; only the formatting changes.
How the publisher-wide policy applies at Molecular Cell
| Aspect | Cell Press (general) | Molecular Cell (in practice) |
|---|---|---|
| Policy text | Standard | Identical |
| AlphaFold figure handling | General guidance | Frequently relevant |
| Cryo-EM AI tools | Not specifically addressed | Common; need clear categorization |
| Computational code complexity | Moderate | High; many custom scripts |
| Reviewer computational expertise | Varies by journal | Consistently high |
| PDB deposition overlap | Not all journals | Standard requirement |
Practical advice for Molecular Cell submissions
For structural biology papers:
- Clearly distinguish AlphaFold/computational predictions (research tools) from ChatGPT/Copilot (writing tools) in STAR Methods
- If you used AI-based tools for model building (ModelAngelo), describe them alongside RELION and Phenix in your processing pipeline
- Structural figures rendered in PyMOL or ChimeraX from real or predicted structures aren't AI-generated images
For biochemistry and biophysics papers:
- If AI helped with kinetic modeling code, disclose this and confirm the models were validated against raw data
- For binding assay analysis, specify whether AI assisted with curve fitting code
- Don't use AI to generate hypothetical binding models in the Discussion unless clearly labeled as speculative
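To make the validation point concrete: whatever fitting code AI helped write, you should be able to recover known parameters from the raw data before trusting it. A minimal, dependency-free Michaelis-Menten fit by grid search, illustrative only; a real analysis would use a proper nonlinear least-squares routine:

```python
def michaelis_menten(s, vmax, km):
    """Reaction rate v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

def fit_mm(substrate, rates, vmax_grid, km_grid):
    """Coarse grid-search least-squares fit (sanity-check sketch only).

    Returns the (Vmax, Km) pair on the grid minimizing the sum of
    squared residuals against the measured rates."""
    best = None
    for vmax in vmax_grid:
        for km in km_grid:
            sse = sum((v - michaelis_menten(s, vmax, km)) ** 2
                      for s, v in zip(substrate, rates))
            if best is None or sse < best[0]:
                best = (sse, vmax, km)
    return best[1], best[2]
```

The check is the point: simulate rates from known Vmax and Km, run the AI-assisted routine, and confirm it returns those parameters before applying it to experimental data.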
For gene regulation and epigenetics papers:
- If AI helped with ChIP-seq, ATAC-seq, or Hi-C analysis code, disclose and validate
- Genomic data processing tools (BWA, MACS2, HiC-Pro) are research tools in standard STAR Methods
- Custom scripts written with AI assistance need separate disclosure
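One lightweight way to keep AI-assisted custom scripts traceable is to embed the provenance note in the script itself, mirroring the STAR Methods disclosure. A sketch with hypothetical wording and names; nothing here is a required format:

```python
# Hypothetical header for an AI-assisted analysis script.
PROVENANCE = (
    "Initial draft generated with GitHub Copilot during manuscript "
    "preparation; reviewed and validated by the authors against the "
    "standard MACS2 peak-calling workflow (see STAR Methods, "
    "Method Details: AI disclosure)."
)

def log_provenance(logger=print):
    """Emit the provenance note so pipeline logs carry the disclosure."""
    logger(f"PROVENANCE: {PROVENANCE}")
```

A header like this costs nothing and means the deposited code tells the same story as the manuscript's disclosure.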
Before submission checklist:
- [ ] AI disclosure in STAR Methods → Method Details
- [ ] Research tools (AlphaFold, RELION, cryoSPARC) in standard STAR Methods
- [ ] Writing tools (ChatGPT, Copilot) in separate AI disclosure
- [ ] No generative AI images or graphical abstracts
- [ ] Structural figures derived from data (PyMOL, ChimeraX renders are fine)
- [ ] All code validated and deposited
- [ ] PDB deposition includes accurate method descriptions
- [ ] All co-authors reviewed AI disclosure