Pre-Submission Review for Materials Science Manuscripts: What Reviewers Expect
Materials science manuscripts face specific scrutiny on characterization completeness, performance benchmarking, and data presentation. Here is what reviewers at top materials journals actually look for.
Senior Researcher, Oncology & Cell Biology
Author context
Specializes in manuscript preparation and peer review strategy for oncology and cell biology, with deep experience evaluating submissions to Nature Medicine, JCO, Cancer Cell, and Cell-family journals.
Readiness scan
Before you submit to Materials, pressure-test the manuscript.
Run the Free Readiness Scan to catch the issues most likely to stop the paper before peer review.
How to use this page well
These pages work best when they behave like tools, not essays. Use the quick structure first, then apply it to the exact journal and manuscript situation.
| Question | What to do |
|---|---|
| Use this page for | Building a point-by-point response that is easy for reviewers and editors to trust. |
| Start with | State the reviewer concern clearly, then pair each response with the exact evidence or revision. |
| Common mistake | Sounding defensive or abstract instead of specific about what changed. |
| Best next step | Turn the response into a visible checklist or matrix before you finalize the letter. |
Decision cue: Materials science reviewers expect three things that many authors underdeliver on: complete characterization of every new material, performance benchmarking against existing alternatives, and figures that communicate results clearly without the caption doing all the work. Missing any one of these can turn a strong materials study into a desk rejection at Advanced Materials, Nature Materials, or ACS Nano.
Check your materials science manuscript readiness in 60 seconds with the free scan.
What materials science reviewers check first
Characterization completeness
For any new material reported in the manuscript, reviewers expect full characterization: structural (XRD, TEM, SEM), compositional (XPS, EDS, ICP), and functional (the property measurements relevant to the claimed application). A new catalyst needs activity data, selectivity data, and stability data. A new nanomaterial needs size distribution, surface chemistry, and purity analysis.
The most common characterization failure is not missing every technique but missing the one that answers the obvious question. A photocatalyst paper without action spectrum data. A battery material without cycling stability. A nanoparticle paper without size distribution beyond the "representative" TEM image. Reviewers notice these gaps immediately because they have seen them hundreds of times.
Performance benchmarking
Materials science is competitive. A new material that improves on the state of the art needs to prove it. Reviewers expect a comparison table showing your material's performance alongside the best published alternatives under comparable conditions.
The most common benchmarking failure is comparing against outdated baselines. Citing a 2015 benchmark when a 2024 paper reported significantly better performance signals that the literature review is incomplete. Check the last 2 years of publications in your target journal to ensure your comparison is current.
Data presentation and figure quality
Materials science papers are figure-heavy. A typical paper in Advanced Materials or ACS Nano has 4 to 8 main figures plus supplementary figures. Reviewers evaluate:
- whether each figure communicates its key result without requiring the caption to explain
- whether scale bars are present and correctly labeled on all microscopy images
- whether axes are labeled with units and appropriate ranges
- whether error bars are present and defined (SD, SEM, CI)
- whether color schemes are consistent across figures
- whether comparison data are presented on the same axes (not in separate panels that make comparison difficult)
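The last three points above are mechanical enough to build into your plotting scripts. The sketch below is a minimal matplotlib example, with invented cycling-stability numbers and material names used purely for illustration; it puts both materials on the same axes, labels axes with units, and draws defined error bars.

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend, suitable for scripts
import matplotlib.pyplot as plt

# Hypothetical capacity-retention data for two materials (illustrative only).
cycles = [0, 100, 200, 300, 400, 500]
material_a = [100, 98, 97, 96, 95, 94]
material_b = [100, 95, 91, 88, 85, 82]
err_a = [1.2] * len(cycles)  # assumed SD over three cells
err_b = [1.5] * len(cycles)

fig, ax = plt.subplots()
# Same axes for both materials, so the comparison is direct.
ax.errorbar(cycles, material_a, yerr=err_a, label="Material A", capsize=3)
ax.errorbar(cycles, material_b, yerr=err_b, label="Material B", capsize=3)
ax.set_xlabel("Cycle number")              # axis label with implicit unit
ax.set_ylabel("Capacity retention (%)")    # axis label with unit
ax.legend()
fig.savefig("cycling_stability.png", dpi=300)
```

State in the caption that error bars are SD over n cells; reviewers will look for that definition either on the figure or in the caption, not in the methods section.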
The materials science pre-submission checklist
For new materials
- full structural characterization (XRD, TEM/SEM, or equivalent)
- compositional analysis (XPS, EDS, NMR, or equivalent)
- purity or quality metrics
- all claimed functional properties measured and reported
- synthesis is described in enough detail for reproduction (reagent sources, temperatures, times, atmosphere)
For performance claims
- benchmarking table comparing to published state-of-the-art (last 2 years)
- comparison conducted under equivalent conditions (same electrolyte, same temperature, same loading)
- stability or durability data (cycling, long-term operation, or accelerated aging)
- statistical treatment of performance data (mean, standard deviation, number of samples)
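The statistical-treatment item above is often done ad hoc; a short script keeps it consistent across every reported metric. This is a minimal sketch using the standard library, with hypothetical replicate measurements standing in for real data; `stdev` uses the sample (n − 1) denominator, which is what most journals expect when you report mean ± SD.

```python
from statistics import mean, stdev

# Hypothetical replicate measurements of one performance metric,
# e.g. from three independently synthesized samples.
replicates = [412.0, 398.5, 405.2]

n = len(replicates)
avg = mean(replicates)
sd = stdev(replicates)  # sample standard deviation (n - 1 denominator)

print(f"{avg:.1f} ± {sd:.1f} (n = {n}, mean ± SD)")
```

Report n alongside every mean ± SD; a comparison table without sample sizes invites exactly the benchmarking objection described above.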
For figures
- every figure has a clear take-home message visible without reading the caption
- scale bars on all microscopy images with correct labels
- axes labeled with units on all plots
- error bars present and defined
- consistent color scheme across the manuscript
- no panels included that are not discussed in the results
For reproducibility
- synthesis methods include specific reagent sources, catalog numbers where relevant
- characterization conditions specified (instrument, parameters)
- analysis code available if computational work is included
- data available in a public repository or supplementary material
Where materials science reviews commonly go wrong
"Interesting material, no application data." Characterizing a new material is chemistry. Showing it does something useful is materials science. If your paper describes a synthesis and characterization without demonstrating performance in a real application, selective materials journals will redirect you to a chemistry journal.
"Performance is good but not benchmarked." Claiming "high efficiency" or "excellent performance" without comparing to published alternatives is a credibility issue. Include a comparison table with specific numbers from specific papers, not vague references to "existing approaches."
"Stability not addressed." A material that performs well once is not useful. Reviewers want to know whether the performance lasts. Cycling data for batteries. Long-term operation for catalysts. Aging data for devices. Missing stability data signals that the material may not be practical.
"Figures are confusing." Materials science papers depend on figures more than most fields. A confusing figure raises the question of whether the data are confusing, which raises the question of whether the results are reliable.
How Manusights helps with materials science manuscripts
The free readiness scan evaluates methodology, citation integrity, and journal fit in about 60 seconds. For materials science manuscripts, the citation verification is especially valuable: ensuring that your benchmarking references are current and that no key competing materials are missing from your comparison.
The $29 AI Diagnostic provides figure-level feedback, which is particularly important for figure-heavy materials science papers. The diagnostic identifies figure-text inconsistencies, checks whether all panels are referenced in the results, and evaluates whether the data presentation is appropriate.
For manuscripts targeting Advanced Materials, Nature Materials, or ACS Nano, Manusights Expert Review ($1,000 to $1,800) connects you with a reviewer who has published in and reviewed for those journals and can evaluate both the materials characterization and the editorial framing.
Reference library
Use the core publishing datasets alongside this guide
This article answers one part of the publishing decision. The reference library covers the recurring questions that usually come next: how selective journals are, how long review takes, and what the submission requirements look like across journals.
Dataset / reference guide
Peer Review Timelines by Journal
Reference-grade journal timeline data that authors, labs, and writing centers can cite when discussing realistic review timing.
Dataset / benchmark
Biomedical Journal Acceptance Rates
A field-organized acceptance-rate guide that works as a neutral benchmark when authors are deciding how selective to target.
Reference table
Journal Submission Specs
A high-utility submission table covering word limits, figure caps, reference limits, and formatting expectations.
Final step
Submitting to Materials?
Run the Free Readiness Scan to see score, top issues, and journal-fit signals before you submit.
Anthropic Privacy Partner. Zero-retention manuscript processing.
Need deeper scientific feedback? See Expert Review Options