Manuscript Preparation · 6 min read · Updated Apr 20, 2026

Pre-Submission Review for Materials Science Manuscripts: What Reviewers Expect

Materials science manuscripts face specific scrutiny on characterization completeness, performance benchmarking, and data presentation. Here is what reviewers at top materials journals actually look for.

Senior Researcher, Oncology & Cell Biology

Author context

Specializes in manuscript preparation and peer review strategy for oncology and cell biology, with deep experience evaluating submissions to Nature Medicine, JCO, Cancer Cell, and Cell-family journals.

Readiness scan

Before you submit to Materials, pressure-test the manuscript.

Run the Free Readiness Scan to catch the issues most likely to stop the paper before peer review.

Check my manuscript · See sample report · Or find your best-fit journal
Anthropic Privacy Partner. Zero-retention manuscript processing.
Journal context

Materials at a glance

Key metrics to place the journal before deciding whether it fits your manuscript and career goals.

Full journal profile
  • Impact factor: 3.2 (Clarivate JCR)
  • Acceptance rate: ~50-60% (overall selectivity)
  • Time to first decision: ~70-100 days median
  • Open access APC: ~$1,800-2,200 (gold OA option)

What makes this journal worth targeting

  • IF 3.2 puts Materials in a visible tier — citations from papers here carry real weight.
  • Scope specificity matters more than impact factor for most manuscript decisions.
  • Acceptance rate of ~50-60% means fit determines most outcomes.

When to look elsewhere

  • When your paper sits at the edge of the journal's stated scope — borderline fit rarely improves after submission.
  • If timeline matters: Materials takes ~70-100 days to a median first decision. A faster-turnaround journal may suit a grant or job deadline better.
  • If OA is required: gold OA costs ~$1,800-2,200. Check institutional agreements before submitting.
Working map

How to use this page well

These pages work best when they behave like tools, not essays. Use the quick structure first, then apply it to the exact journal and manuscript situation.

  • Use this page for: Building a point-by-point response that is easy for reviewers and editors to trust.
  • Start with: State the reviewer concern clearly, then pair each response with the exact evidence or revision.
  • Common mistake: Sounding defensive or abstract instead of specific about what changed.
  • Best next step: Turn the response into a visible checklist or matrix before you finalize the letter.

Quick answer: Pre-submission review for materials science should test whether the paper already has complete characterization, honest benchmarking, clear figures, and enough durability evidence for the journal you want. Missing any one of those can turn a strong study into a fast rejection at top materials journals, because reviewers often judge the paper visually and comparatively before they read every line.

Materials science pre-submission review is worth doing when it tests the three places reviewers attack first: characterization completeness, benchmarking honesty, and figure clarity. A manuscript can contain real science and still fail because one of those three foundations is obviously incomplete.

If the paper would look shaky to a reviewer reading only the abstract and the main figures, the submission is not ready yet.

Check your materials science manuscript readiness in 1-2 minutes with the free scan.

Pre-submission review for materials science: what reviewers screen first

For any new material reported in the manuscript, reviewers expect full characterization: structural (XRD, TEM, SEM), compositional (XPS, EDS, ICP), and functional (the property measurements relevant to the claimed application). A new catalyst needs activity data, selectivity data, and stability data. A new nanomaterial needs size distribution, surface chemistry, and purity analysis.

The most common characterization failure is not missing every technique but missing the one that answers the obvious question. A photocatalyst paper without action spectrum data. A battery material without cycling stability. A nanoparticle paper without size distribution beyond the "representative" TEM image. Reviewers notice these gaps immediately because they have seen them hundreds of times.

In our pre-submission review work

In our pre-submission review work, materials manuscripts usually fail in one of three places. The characterization package looks broad but misses the one method that tests the central claim. The benchmark table is real but too easy, outdated, or condition-mismatched. Or the first figure looks exciting until the reader notices that the stability or device-level relevance is still thin.

Our review of current materials-journal author guidance points the same way. Editors are not only screening for novelty. They are screening for whether the evidence package already looks complete enough that a skeptical reviewer does not have to request the obvious missing experiment.

Performance benchmarking

Materials science is competitive. A new material that improves on the state of the art needs to prove it. Reviewers expect a comparison table showing your material's performance alongside the best published alternatives under comparable conditions.

The most common benchmarking failure is comparing against outdated baselines. Citing a 2015 benchmark when a 2024 paper reported significantly better performance signals that the literature review is incomplete. Check the last 2 years of publications in your target journal to ensure your comparison is current.
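A benchmark table is easiest for a reviewer to audit when every row carries the same columns: material, metric, test conditions, and source. A minimal sketch of that structure in Python, with placeholder values and hypothetical references (not real data):

```python
# Sketch of the columns a benchmark comparison table should carry so a
# reviewer can audit it quickly. All entries are placeholders, not real data.
rows = [
    # material, metric value, test conditions, source
    ("This work",         "312 mV", "10 mA cm^-2, 1 M KOH", "-"),
    ("Material A (2024)", "335 mV", "10 mA cm^-2, 1 M KOH", "Ref. X"),
    ("Material B (2023)", "350 mV", "10 mA cm^-2, 1 M KOH", "Ref. Y"),
]

header = ("Material", "Overpotential", "Conditions", "Source")
# Column widths sized to the longest entry, so the table stays aligned
widths = [max(len(str(r[i])) for r in rows + [header]) for i in range(4)]

print("  ".join(h.ljust(w) for h, w in zip(header, widths)))
for r in rows:
    print("  ".join(str(c).ljust(w) for c, w in zip(r, widths)))
```

The point is the schema, not the printing: identical conditions in every row is what makes the comparison defensible.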

Data presentation and figure quality

Materials science papers are figure-heavy. A typical paper in Advanced Materials or ACS Nano has 4 to 8 main figures plus supplementary figures. Reviewers evaluate:

  • whether each figure communicates its key result without requiring the caption to explain
  • whether scale bars are present and correctly labeled on all microscopy images
  • whether axes are labeled with units and appropriate ranges
  • whether error bars are present and defined (SD, SEM, CI)
  • whether color schemes are consistent across figures
  • whether comparison data are presented on the same axes (not in separate panels that make comparison difficult)
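The same-axes point in the list above can be sketched with matplotlib. This is a minimal illustration, not a publication template, and every number in it is a hypothetical placeholder:

```python
# Minimal sketch of a reviewer-friendly comparison figure: both datasets on
# the same axes, error bars defined, units on the axis labels.
# All values below are hypothetical placeholders, not real measurements.
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs anywhere
import matplotlib.pyplot as plt
import numpy as np

cycles = np.array([0, 100, 200, 300, 400, 500])
this_work = np.array([180.0, 176.0, 174.0, 172.0, 171.0, 170.0])  # mAh/g
baseline = np.array([175.0, 168.0, 161.0, 155.0, 150.0, 146.0])   # mAh/g
this_sd = np.full(cycles.shape, 3.0)  # SD across n = 3 cells (hypothetical)
base_sd = np.full(cycles.shape, 4.0)

fig, ax = plt.subplots()
ax.errorbar(cycles, this_work, yerr=this_sd, marker="o",
            label="This work (mean ± SD, n = 3)")
ax.errorbar(cycles, baseline, yerr=base_sd, marker="s",
            label="Published baseline (mean ± SD, n = 3)")
ax.set_xlabel("Cycle number")
ax.set_ylabel("Specific capacity (mAh g$^{-1}$)")
ax.legend()
fig.savefig("comparison_figure.png", dpi=300)
```

Putting both curves on one pair of labeled axes, with the error-bar definition in the legend, answers three of the checklist items in a single panel.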

For new materials

  • full structural characterization (XRD, TEM/SEM, or equivalent)
  • compositional analysis (XPS, EDS, NMR, or equivalent)
  • purity or quality metrics
  • all claimed functional properties measured and reported
  • synthesis described in enough detail for reproduction (reagent sources, temperatures, times, atmosphere)

For performance claims

  • benchmarking table comparing to published state-of-the-art (last 2 years)
  • comparison conducted under equivalent conditions (same electrolyte, same temperature, same loading)
  • stability or durability data (cycling, long-term operation, or accelerated aging)
  • statistical treatment of performance data (mean, standard deviation, number of samples)
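The last item in the list above is small enough to compute with the standard library. A sketch of the minimal statistical reporting reviewers expect, using hypothetical replicate values:

```python
# Minimal statistical treatment of a performance claim: mean, sample
# standard deviation, and sample size from independent replicates.
# The replicate values are hypothetical.
from statistics import mean, stdev

# e.g. overpotential (mV) at 10 mA cm^-2 from three independently
# synthesized batches
replicates = [312.0, 308.0, 317.0]

n = len(replicates)
avg = mean(replicates)
sd = stdev(replicates)  # sample standard deviation (n-1 denominator)

print(f"{avg:.0f} ± {sd:.0f} mV (mean ± SD, n = {n})")
```

Reporting all three numbers, and stating that the replicates are independently synthesized batches rather than repeat measurements of one sample, is what makes the claim auditable.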

For figures

  • every figure has a clear take-home message visible without reading the caption
  • scale bars on all microscopy images with correct labels
  • axes labeled with units on all plots
  • error bars present and defined
  • consistent color scheme across the manuscript
  • no panels included that are not discussed in the results

For reproducibility

  • synthesis methods include specific reagent sources, catalog numbers where relevant
  • characterization conditions specified (instrument, parameters)
  • analysis code available if computational work is included
  • data available in a public repository or supplementary material

Where materials science reviews commonly go wrong

"Interesting material, no application data." Characterizing a new material is chemistry. Showing it does something useful is materials science. If your paper describes a synthesis and characterization without demonstrating performance in a real application, selective materials journals will redirect you to a chemistry journal.

"Performance is good but not benchmarked." Claiming "high efficiency" or "excellent performance" without comparing to published alternatives is a credibility issue. Include a comparison table with specific numbers from specific papers, not vague references to "existing approaches."

"Stability not addressed." A material that performs well once is not useful. Reviewers want to know whether the performance lasts. Cycling data for batteries. Long-term operation for catalysts. Aging data for devices. Missing stability data signals that the material may not be practical.

"Figures are confusing." Materials science papers depend on figures more than most fields. A confusing figure raises the question of whether the data are confusing, which raises the question of whether the results are reliable.

Submit If / Think Twice If

Submit if

  • the material demonstrates clear functional performance in a realistic application context
  • benchmarking against recent published alternatives is thorough and honest
  • stability or cycling data supports the durability of the claimed performance
  • figures are clean, logically sequenced, and directly referenced in the text

Think twice if

  • the paper describes synthesis and characterization without application-level demonstration
  • the performance claims lack comparison to specific published numbers from recent work
  • stability or long-term operation data are missing entirely
  • the paper reads as chemistry rather than materials science

Readiness check

Run the scan while the Materials author requirements are in front of you.

See how this manuscript scores against those requirements before you submit.


How Manusights helps with materials science manuscripts

The manuscript readiness check evaluates methodology, citation integrity, and journal fit in about 1-2 minutes. For materials science manuscripts, the citation verification is especially valuable: ensuring that your benchmarking references are current and that no key competing materials are missing from your comparison.

The manuscript readiness check provides figure-level feedback, which is particularly important for figure-heavy materials science papers. The diagnostic identifies figure-text inconsistencies, checks whether all panels are referenced in the results, and evaluates whether the data presentation is appropriate.

For manuscripts targeting Advanced Materials, Nature Materials, or ACS Nano, Manusights Expert Review ($1,000 to $1,800) connects you with a reviewer who has published in and reviewed for those journals and can evaluate both the materials characterization and the editorial framing.

Fast pre-submit matrix

The easiest way to pressure-test a materials manuscript is to ask what a skeptical reviewer would attack on the first figure pass.

  • Is the material fully characterized? Weak: one attractive technique stands in for the whole identity claim. Strong: structural, compositional, and functional evidence work together.
  • Is the performance claim actually benchmarked? Weak: outdated or non-comparable baselines. Strong: recent, condition-matched comparisons with clear numbers.
  • Does the stability story exist? Weak: one strong initial result, little durability evidence. Strong: cycling, aging, or operation data that supports practical use.
  • Do the figures carry the argument clearly? Weak: the caption has to rescue the panel. Strong: the result is visible before the reader decodes the caption.

A final materials-science checklist

Before submission, confirm:

  • the strongest claim in the abstract is supported by a figure that can survive skeptical reading
  • every benchmark is recent enough and compared under meaningfully similar conditions
  • the missing experiment a reviewer would obviously request has either been done or openly acknowledged
  • the paper reads like a materials result with function, not just a characterization package with ambition added later
  • supplementary data strengthens the main story instead of hiding essential controls

That is the difference between a paper that looks complete and one that looks one experiment short of the journal you want.

Materials manuscripts are often judged visually before they are judged line by line. If the title, first figure, benchmark framing, and stability evidence do not align quickly, reviewers start reading with distrust. A good pre-submission review should lower that distrust before the manuscript ever enters the portal.

That is why the best materials review is rarely generic. It should sound like someone who knows what a top materials reviewer would ask next and can see the missing comparison, missing durability proof, or missing figure logic before the journal does.

When is field-specific pre-submission review worth it?

Worth the investment if:

  • You are targeting a journal with <20% acceptance in this field
  • The paper is career-critical (tenure, grant, job market)
  • A desk rejection would cost 3-6 months in resubmission cycles
  • You want field-matched reviewer feedback before submission

Skip if:

  • Experienced colleagues in this field have already reviewed the manuscript
  • Your timeline is too tight to act on feedback
  • The paper is going to a journal where you have published before

Frequently asked questions

What level of characterization do reviewers expect for a new material?

They expect characterization that is proportional to the claim. Structural, compositional, and functional evidence all need to work together, and missing the one obvious validation technique for the material class still triggers fast skepticism.

How should performance be benchmarked?

Use recent, condition-matched comparisons that a reviewer can audit quickly. Benchmarking against outdated literature or under easier conditions damages trust more than a slightly weaker but honest result.

Why do strong-looking materials manuscripts still get rejected?

The material looks interesting, but the application-level proof, durability story, or benchmark fairness is not strong enough for the ambition of the journal. Many papers are one comparison or one stability experiment short.

Is pre-submission review worth it outside the top-tier journals?

Often yes, especially when characterization completeness, benchmarking fairness, or figure clarity are still open to attack. Those issues slow down review cycles at almost every level of the field.

Sources

  1. Why Was My Paper Rejected without Review? (ES&T)
  2. Common rejection reasons (Springer Nature)
  3. Nature Materials publishing options
  4. Author guidelines for Journal of Materials Chemistry A

Final step

Submitting to Materials?

Run the Free Readiness Scan to see score, top issues, and journal-fit signals before you submit.

