Journal Guides · 5 min read · Updated Apr 28, 2026

Implementation Science Submission Guide

Implementation Science's submission process, first-decision timing, and the editorial checks that matter before peer review begins.

Senior Researcher, Oncology & Cell Biology

Author context

Specializes in manuscript preparation and peer review strategy for oncology and cell biology, with deep experience evaluating submissions to Nature Medicine, JCO, Cancer Cell, and Cell-family journals.

Readiness scan

Before you submit to Implementation Science, pressure-test the manuscript.

Run the Free Readiness Scan to catch the issues most likely to stop the paper before peer review.

Check my readiness · See sample report · Or find your best-fit journal
Anthropic Privacy Partner. Zero-retention manuscript processing.
Submission at a glance

Key numbers before you submit to Implementation Science

Acceptance rate, editorial speed, and cost context — the metrics that shape whether and how you submit.

Full journal profile
Impact factor: 9.4 (Clarivate JCR)
Acceptance rate: ~20-25% (overall selectivity)
Time to first decision: 4-8 weeks

What acceptance rate actually means here

  • Implementation Science accepts roughly 20-25% of submissions, and desk rejection runs around 30-40%.
  • Scope misfit and framing problems drive most early rejections, not weak methodology.
  • Papers that reach peer review face a different bar: novelty, rigor, and fit with the journal's editorial identity.

What to check before you upload

  • Scope fit — does your paper address the exact problem this journal publishes on?
  • Desk decisions come faster than full review; scope problems surface early.
  • Cover letter framing — editors use it to judge fit before reading the manuscript.
Submission map

How to approach Implementation Science

Use the submission guide like a working checklist. The goal is to make fit, package completeness, and cover-letter framing obvious before you open the portal.

Stage by stage, what to check:

  • 1. Scope: presubmission inquiry (optional)
  • 2. Package: full submission
  • 3. Cover letter: editorial triage
  • 4. Final check: peer review

Quick answer: This Implementation Science submission guide is for implementation researchers evaluating their work against the journal's theory and rigor bar. The journal is selective (~20-25% acceptance, 30-40% desk rejection). The editorial standard requires substantive theory-driven implementation-science contributions, not descriptive program evaluations.

If you're targeting Implementation Science, the main risk is descriptive framing, missing theoretical grounding, or weak implementation-outcomes measurement.

From our manuscript review practice

Of submissions we've reviewed for Implementation Science, the most consistent desk-rejection trigger is descriptive program evaluations without theory-driven implementation-research framing.

How this page was created

This page was researched from Implementation Science's author guidelines, BMC editorial-policy materials, Clarivate JCR data, SciRev community reports, and Manusights internal analysis of submissions to Implementation Science and adjacent venues.

Implementation Science Journal Metrics

  • Impact Factor (2024 JCR): 9.4
  • 5-Year Impact Factor: ~10+
  • CiteScore: 18.0
  • Acceptance Rate: ~20-25%
  • Desk Rejection Rate: ~30-40%
  • First Decision: 4-8 weeks
  • APC (Open Access): $2,790 (2026)
  • Publisher: BMC / Springer Nature

Source: Clarivate JCR 2024, BMC editorial disclosures (accessed April 2026).

Implementation Science Submission Requirements and Timeline

  • Submission portal: BMC Editorial Manager
  • Article types: Research, Methodology, Study Protocol, Debate, Systematic Review
  • Article length: 4,000-7,000 words typical
  • Cover letter: required
  • First decision: 4-8 weeks
  • Peer review duration: 8-14 weeks

Source: Implementation Science author guidelines.

Submission snapshot

What to pressure-test, and what should already be true before upload:

  • Implementation-science contribution: the manuscript advances implementation theory or methodology.
  • Theoretical grounding: engagement with established implementation frameworks (CFIR, PARIHS, RE-AIM, ERIC).
  • Implementation-outcomes measurement: adoption, fidelity, sustainability, or comparable measures.
  • Methodological rigor: an appropriate qualitative or quantitative method.
  • Cover letter: establishes the implementation-science contribution.

What this page is for

Use this page when deciding:

  • whether the contribution is genuinely implementation science
  • whether theoretical grounding is rigorous
  • whether implementation-outcomes measurement is appropriate

What should already be in the package

  • a clear implementation-science contribution
  • theoretical grounding in established implementation frameworks
  • implementation-outcomes measurement
  • rigorous methodology
  • a cover letter establishing the implementation-science contribution

Package mistakes that trigger early rejection

  • Descriptive program evaluations without implementation-science framing.
  • Missing theoretical grounding.
  • Weak implementation-outcomes measurement.
  • Clinical effectiveness research without implementation focus.

What makes Implementation Science a distinct target

Implementation Science is the flagship implementation-research journal.

Theory-driven standard: the journal differentiates from clinical trial journals by demanding theory-driven implementation-research framing.

Implementation-outcomes expectation: editors expect measurement of adoption, fidelity, sustainability, or comparable implementation outcomes.

The 30-40% desk-rejection rate makes editorial triage, not peer review, the decisive screen.

What a strong cover letter sounds like

The strongest Implementation Science cover letters establish:

  • the implementation-science contribution
  • the theoretical grounding
  • the implementation-outcomes measurement
  • the methodological approach

Diagnosing pre-submission problems

  • Problem: descriptive framing. Fix: add theory-driven implementation-research framing.
  • Problem: theoretical grounding is weak. Fix: engage with CFIR, PARIHS, RE-AIM, or other established frameworks.
  • Problem: implementation outcomes are weak. Fix: add adoption, fidelity, sustainability, or comparable measures.

Readiness check

Run the scan while Implementation Science's requirements are in front of you.

See how this manuscript scores against Implementation Science's requirements before you submit.

Check my readiness · See sample report · Or find your best-fit journal
Anthropic Privacy Partner. Zero-retention manuscript processing.

How Implementation Science compares against nearby alternatives

Method note: the comparison reflects published author guidelines and Manusights internal analysis. We have not personally published in Implementation Science, so the comparison is bounded by publicly documented editorial behavior. Pros and cons are based on documented editorial scope.

  • Implementation Science. Best fit: theory-driven implementation research. Think twice if the topic is a descriptive program evaluation.
  • Implementation Science Communications. Best fit: broader implementation reports. Think twice if the topic is theory-driven research.
  • BMC Health Services Research. Best fit: health services research broadly. Think twice if the topic is implementation-specific.
  • Translational Behavioral Medicine. Best fit: translational behavioral research. Think twice if the topic is implementation-focused.

Submit If

  • the contribution is genuinely implementation science
  • theoretical grounding is rigorous
  • implementation outcomes are measured
  • methodology is rigorous

Think Twice If

  • the manuscript is a descriptive program evaluation
  • theoretical grounding is weak
  • the work fits Implementation Science Communications or a specialty venue better

Desk-rejection patterns from our pre-submission reviews

In our pre-submission review work with implementation manuscripts targeting Implementation Science, three patterns generate the most consistent desk rejections.

In our experience, roughly 35% of Implementation Science desk rejections trace to descriptive program-evaluation framing, roughly 25% to missing theoretical grounding, and roughly 20% to weak implementation-outcomes measurement.

  • Descriptive program evaluations without implementation-science framing. Implementation Science editors look for theory-driven research, not just program-evaluation reports. We observe submissions framed as program implementations without theoretical grounding routinely desk-rejected.
  • Missing theoretical grounding in established frameworks. Editors expect engagement with CFIR, PARIHS, RE-AIM, ERIC, or comparable frameworks. We see manuscripts using ad-hoc framing without established frameworks routinely returned.
  • Weak implementation-outcomes measurement. Implementation Science specifically expects measurement of implementation outcomes (adoption, fidelity, acceptability, feasibility, sustainability). We find papers reporting only clinical outcomes without implementation outcomes routinely declined. An Implementation Science theory and outcomes readiness check can identify whether the package supports a submission.

Clarivate JCR 2024 bibliometric data places Implementation Science as the leading implementation-research journal.

What we look for during pre-submission diagnostics

In pre-submission diagnostic work for top implementation-research journals, we consistently see four signals that distinguish strong submissions from weak ones. First, the contribution must be theory-driven, not descriptive; submissions framed as program-evaluation reports without theoretical framing fail at desk screening. Second, theoretical grounding should engage with established implementation frameworks (CFIR, PARIHS, RE-AIM, ERIC). Third, implementation outcomes (adoption, fidelity, sustainability) should be measured alongside any clinical outcomes. Fourth, methodology should be appropriate to the implementation-research question.

How theory-driven framing matters

The single most consistent feedback class we deliver in pre-submission diagnostics for Implementation Science is the descriptive-versus-theory-driven distinction. Implementation Science editors expect theoretical framing, not just program-implementation reports. Submissions framed as "we implemented program X in setting Y" routinely receive "where is the implementation theory?" feedback during desk screening. We coach authors to lead with the implementation-research question and frame the program in service of that question. Papers framed as "we tested how implementation strategy X, grounded in CFIR construct Y, affected adoption and fidelity in setting Z" receive better editorial traction. The same logic applies across implementation-research journals: editors are operating with limited slot inventory, and the submissions that get traction lead with the theory-driven implementation question.

Common pre-submission diagnostic patterns we encounter

Beyond the rubric checks, three pre-submission diagnostic patterns recur most often in the manuscripts we review for Implementation Science. First, manuscripts where the abstract emphasizes clinical or program outcomes without implementation outcomes are flagged at desk for descriptive framing. We recommend the abstract's central sentences state the implementation question, the theoretical framework, and the implementation outcomes measured. Second, manuscripts where implementation strategies are reported without explicit mapping to ERIC taxonomy or comparable framework are flagged for strategy-specification gaps. We recommend explicit mapping of strategies to established taxonomies. Third, manuscripts that lack engagement with Implementation Science's recent issues are at risk of being told the contribution doesn't fit the publication conversation.

Frequently asked questions

How do you submit to Implementation Science?
Submit through BMC Editorial Manager. The journal accepts unsolicited Research, Methodology, Study Protocol, and Debate articles on implementation science. The cover letter should establish the theory-driven implementation-research contribution.

What are the journal's impact factor and acceptance rate?
Implementation Science's 2024 impact factor is around 9.4. The acceptance rate runs ~20-25%, with desk rejection around 30-40%. Median first decisions arrive in 4-8 weeks.

What does the journal publish?
Original research on the implementation of evidence-based practices in healthcare: implementation strategies, implementation theory and frameworks, evaluation of implementation outcomes, dissemination research, de-implementation, and implementation methodology.

Why do submissions get desk-rejected?
The most common reasons: descriptive program evaluations without implementation-science framing, missing theoretical grounding in implementation frameworks, weak implementation-outcomes measurement, or scope mismatch (clinical effectiveness research without implementation focus).

Sources

  1. Implementation Science author guidelines
  2. Implementation Science homepage
  3. BMC editorial policies
  4. Clarivate JCR 2024: Implementation Science
  5. SciRev BMC journals data

Final step

Submitting to Implementation Science?

Run the Free Readiness Scan to see score, top issues, and journal-fit signals before you submit.

Anthropic Privacy Partner. Zero-retention manuscript processing.
