Journal Guides · 7 min read · Updated Apr 1, 2026

JCI Acceptance Rate

The Journal of Clinical Investigation's acceptance rate is about 10%. Use it as a selectivity signal, then sanity-check scope, editorial fit, and submission timing.

Author context: Associate Professor, Clinical Medicine & Public Health. Experience with NEJM, JAMA, BMJ.

Journal evaluation

Want the full picture on Journal of Clinical Investigation?

See scope, selectivity, submission context, and what editors actually want before you decide whether Journal of Clinical Investigation is realistic.

Selectivity context

What Journal of Clinical Investigation's acceptance rate means for your manuscript

Acceptance rate is one signal. Desk rejection rate, scope fit, and editorial speed shape the realistic path more than the headline number.

Full journal profile
Acceptance rate: ~8-10% (overall selectivity)
Impact factor: 13.6 (Clarivate JCR)
Time to first decision: 2-4 weeks

What the number tells you

  • Journal of Clinical Investigation accepts roughly 8-10% of submissions, and desk rejection accounts for a disproportionate share of early returns.
  • Scope misfit drives most desk rejections, not weak methodology.
  • Papers that reach peer review face a higher bar: novelty and fit with editorial identity.

What the number does not tell you

  • Whether your specific paper type (review, letter, brief communication) faces the same rate as full articles.
  • How fast you will hear back — check time to first decision separately.
  • What open access publishing will cost if you choose that route.

Quick answer: The Journal of Clinical Investigation accepts approximately 10% of submissions. With an impact factor of 13.6 (Q1, ranked 5th of 195 in Medicine, General and Internal), JCI occupies an unusual position in the journal landscape: it's one of the few journals that bridges basic mechanistic research and clinical medicine with equal seriousness. The acceptance rate reflects that dual demand.

JCI's overall acceptance rate is roughly 10%. Desk rejection accounts for 60-70% of submissions, typically within 2-3 weeks. Papers that enter review have an estimated 25-35% acceptance rate. The editorial filter tests whether the paper has both mechanistic depth AND disease relevance. Papers that are strong on one but missing the other get filtered.

The numbers

Metric | Value
Overall acceptance rate | ~10%
Estimated desk rejection rate | 60-70%
Post-review acceptance rate | ~25-35% (estimated)
Impact Factor (2024 JCR) | 13.6
JCR quartile | Q1
JCR rank | 5/195 (Medicine, General and Internal)
Publisher | American Society for Clinical Investigation (ASCI)
Time to desk decision | 2-3 weeks

What JCI actually selects for

JCI's editorial identity is specific: the journal wants papers that illuminate disease mechanisms. Not basic biology without disease relevance. Not clinical outcomes without mechanistic insight. The sweet spot is mechanistic work that changes how the field understands a disease process.

This is different from Nature Medicine (which leans translational, moving discoveries toward clinical application) and from Cell (which leans mechanistic without requiring disease relevance). JCI wants the mechanism AND the disease.

The distinction matters practically. A study identifying a new signaling pathway controlling inflammation is a Cell paper if the biology is the story. It's a JCI paper if the pathway explains why a specific disease behaves the way it does. It's a Nature Medicine paper if the pathway can be therapeutically targeted and the paper shows early evidence of that.

The acceptance funnel: where papers actually get filtered

Think of JCI's selection as a three-stage funnel.

Stage 1: desk triage (60-70% filtered out)

JCI's editors are working scientists, not full-time professional editors as at Nature. At triage they evaluate two axes simultaneously: is the mechanism deep enough, and is the disease relevance real? The desk decision takes 2-3 weeks, slower than Nature or Cell (1-2 weeks), because the dual assessment genuinely takes more time.

Stage 2: peer review (another ~40-50% of reviewed papers filtered)

JCI reviewers don't just want to see a connection between mechanism and disease; they want it experimentally supported. A Discussion paragraph saying "this pathway may be relevant to human disease" isn't enough. The most common post-review rejection: the mechanism is convincing but the disease model is too artificial. Cell-line-only studies with speculative connections to human pathology rarely survive JCI review.

Stage 3: revision (another ~15-25% filtered)

JCI revision requests are often substantial. The most frequent ask: add human data. If your original submission had mouse models but no human validation, expect a request for correlative evidence from patient samples or clinical cohorts. Papers that fail at revision usually fail because the authors can't deliver that data.
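The funnel arithmetic above can be sanity-checked by multiplying the estimated stage rates. A minimal sketch in Python (the rates are this article's estimates, not official JCI statistics):

```python
# Sanity-check: combine the funnel's estimated stage rates into an
# overall acceptance rate. Rates below are this article's estimates,
# not official JCI figures.

def overall_rate(desk_pass, post_review_accept):
    """P(accepted) = P(survive desk triage) * P(accepted given reviewed)."""
    return desk_pass * post_review_accept

# Desk rejection is 60-70%, so 30-40% of submissions reach review;
# reviewed papers are accepted at an estimated 25-35%.
low  = overall_rate(0.30, 0.25)   # pessimistic bound
high = overall_rate(0.40, 0.35)   # optimistic bound

print(f"overall acceptance: {low:.1%} to {high:.1%}")  # 7.5% to 14.0%
```

The resulting 7.5-14% band brackets the ~10% headline figure, so the stage estimates are internally consistent.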

Where papers get desk-rejected

Pure basic biology. A paper about a signaling pathway that doesn't connect to a disease process. The mechanism may be excellent, but without disease relevance, JCI's editors redirect it to a basic science journal. This is the single most common desk rejection pattern.

Clinical observation without mechanism. A large cohort study showing that patients with condition X have outcome Y. Important work, but JCI wants to know WHY. The mechanism is the point. These papers belong at JAMA, the Lancet, or NEJM.

The connection is an afterthought. A strong basic science paper with a "disease relevance" paragraph tacked onto the discussion. JCI editors can tell when the disease angle is performative rather than integral to the study design. If the disease wasn't part of your experimental plan, and you added it only because you thought it would help at JCI, the editors will see through it.

Wrong disease model. The mechanism is interesting and the disease connection is real, but the disease model doesn't recapitulate human pathology well enough. A study using a mouse model that doesn't faithfully represent the human disease will struggle, even if the mechanism is novel.

Where papers get rejected after review

  • The mechanism is interesting but the disease model is too artificial (cell-line studies without in vivo validation)
  • The human data supporting the translational bridge is missing or weak
  • The paper tries to serve both a basic and clinical audience and satisfies neither; it reads like two half-papers
  • Reviewers find that the mechanism was already known, and the disease application isn't new enough to justify a JCI publication
  • The revision didn't deliver the human data or disease-model experiments reviewers requested

Readiness check

See how your manuscript scores against Journal of Clinical Investigation before you submit.

Run the scan with Journal of Clinical Investigation as your target journal. Get a fit signal alongside the IF context.


How JCI compares to translational medicine competitors

Journal | Acceptance rate | IF (2024 JCR) | What it selects for
JCI | ~10% | 13.6 | Disease mechanisms with clinical relevance
Nature Medicine | ~7% | 50.0 | Translational research bridging bench to bedside
Science Translational Medicine | ~8% | 14.6 | Translational pipeline from discovery to application
JCI Insight | ~20% | 6.1 | Broader clinical investigation, less mechanistic
Journal of Experimental Medicine | ~12% | 10.6 | Immunology and disease biology

JCI vs Nature Medicine

This is the comparison most authors struggle with. Both want translational work, but the editorial emphasis is different. JCI wants the mechanism to be the protagonist. Nature Medicine wants the translational path to be the protagonist. Here's a concrete test: if you removed the disease context from your paper, would the mechanism still be interesting on its own? If yes, JCI. If the mechanism is mainly interesting because of where it leads therapeutically, Nature Medicine.

Nature Medicine is harder to get into (IF 50.0, ~7% acceptance) and has full-time professional editors who desk-reject faster. JCI's working-scientist editors take longer at the desk but may give you more specific feedback about why a paper doesn't fit.

JCI vs Science Translational Medicine (STM)

STM wants papers further along the translational pipeline. A paper showing a new disease mechanism without any therapeutic angle can work at JCI but probably won't work at STM. A paper showing that a known mechanism can be targeted therapeutically is more STM than JCI. STM also publishes more clinical trial results and biomarker validation studies, work that JCI would find too clinical and not mechanistic enough.

JCI vs JCI Insight

This is the most important comparison for authors considering JCI. JCI Insight (IF 6.1, acceptance ~20%) accepts papers that are clinically interesting but don't quite reach the mechanistic standard of the flagship. If the disease mechanism isn't the central story but the clinical data is strong, JCI Insight is worth considering directly.

Submitting to JCI first and cascading to JCI Insight after rejection is common but costs you 2-4 months. If the mechanism honestly isn't the centerpiece, going directly to JCI Insight is the smarter move.

The ASCI membership factor

JCI is owned by the American Society for Clinical Investigation. ASCI members are elected physician-scientists, and the journal's editorial board reflects that community. This matters because JCI's editorial perspective is that of the physician-scientist, someone who sees patients AND runs a basic science lab. If your paper would excite that specific kind of reader, it fits. If it would excite a pure clinician or a pure basic scientist but not someone who does both, it might not.

Should you submit?

Submit if:

  • the paper reveals a disease mechanism with clear implications for understanding pathology
  • the work combines mechanistic depth (biochemistry, genetics, cell biology) with disease-model evidence
  • human data or clinical samples support the translational relevance
  • the disease angle is integral to the study design, not an afterthought
  • you can identify a specific paragraph in your paper where the mechanism explains the disease, not just correlates with it

Think twice if:

  • the mechanism is strong but has no clear disease connection (basic science journals are better)
  • the clinical data is strong but the mechanism is thin (JAMA, Lancet, or NEJM are better)
  • JCI Insight would serve the paper better with its broader clinical scope and ~20% acceptance rate
  • Nature Medicine or Science Translational Medicine is a more natural editorial fit because the translational pipeline is the story
  • you don't have human data and can't get it during revision; JCI increasingly expects it
  • the paper reads like two separate studies (mechanism + clinical observation) rather than one integrated story
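For illustration only, the two checklists above can be condensed into a rough self-scoring sketch. The criterion strings are paraphrases of this guide's advice, not an official JCI rubric:

```python
# Rough self-check distilled from the submit / think-twice lists above.
# Illustrative only: the criteria are paraphrases, not JCI policy.

CRITERIA = (
    "mechanism explains the disease, not just correlates with it",
    "disease model is integral to the study design",
    "human data or clinical samples support the translation",
)

def jci_fit(answers: dict) -> str:
    """answers maps each criterion string to True/False."""
    met = sum(bool(answers.get(c)) for c in CRITERIA)
    if met == len(CRITERIA):
        return "plausible JCI fit"
    if met == 0:
        return "consider JCI Insight or a different venue"
    return "borderline: expect substantial revision requests"
```

A paper meeting all three criteria matches the profile JCI's dual filter selects for; anything less usually means either a different venue or a substantial human-data revision request.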

A JCI submission readiness check can help assess whether the mechanistic depth and disease relevance meet JCI's dual standard before you submit.

What pre-submission reviews reveal about JCI submissions

In our pre-submission review work evaluating manuscripts targeting the Journal of Clinical Investigation, three patterns generate the most consistent desk rejections. Each reflects JCI's dual requirement: mechanistic depth AND genuine disease relevance, with the two integrated in the experimental design rather than one appended to the other.

Basic biology paper without a real disease model. JCI's editors, who are working physician-scientists rather than full-time professional editors, evaluate two axes simultaneously at triage: is the mechanism deep enough, and is the disease relevance real? The failure pattern is a paper establishing a new signaling pathway, a new molecular interaction, or a new cellular mechanism in a model system, with technically rigorous biochemistry and genetics, where the disease connection exists as a Discussion paragraph explaining why the pathway might be relevant to condition X, without a disease model in the experimental design. JCI's editorial standard requires that the disease context be designed into the study, not added after the mechanism is established. Papers that are excellent basic biology with a disease speculation section are redirected to Cell, Molecular Cell, eLife, or PNAS, where the mechanism itself can be the primary story. The editorial message is not that the science is insufficient; it is that the disease angle is performative rather than integral.

Clinical study without mechanistic insight into why the finding occurs. JCI is not a clinical outcomes journal. The failure pattern is a large, well-designed clinical cohort study, clinical trial report, or epidemiological analysis reporting that patients with feature X have outcome Y, where the statistical analysis is rigorous and the clinical finding is genuine, but the mechanism explaining the association is not investigated. A paper showing that a biomarker level predicts relapse in a cancer cohort, or that patients with a specific genetic variant respond differently to a treatment, without mechanistic experiments establishing why the biomarker or variant creates that difference, is a clinical observation that belongs at JAMA, the Lancet, or NEJM. JCI wants to know not just what happens clinically but why it happens mechanistically.

Disease model that does not recapitulate human pathology convincingly. Post-review rejections at JCI frequently cite the artificial quality of the disease model relative to the human disease being claimed. The failure pattern is a paper using a mouse model (diet-induced, transgenic, or toxin-induced) where the model's relationship to human pathology is contested in the field, with mechanistic findings that are persuasive in the model but without human validation. JCI reviewers ask whether the mechanism operates in human disease, not just in the model. The journal's own stated standards increasingly expect human evidence at submission, whether from patient biopsy analysis, patient-derived cell lines, or clinical cohort correlates. Papers that arrive at review with animal-only data and propose human validation as future work are rejected more consistently today than they were five years ago. A JCI submission readiness check can assess whether the mechanistic depth and disease-model evidence meet JCI's dual standard.

Frequently asked questions

What is JCI's acceptance rate?

JCI's acceptance rate is approximately 10%. This includes both desk rejections and post-review rejections. Papers that pass the desk and enter review have roughly a 25-35% chance of acceptance.

Is JCI hard to get into?

JCI is selective, but the selectivity is specific. Papers need both mechanistic depth and genuine disease relevance. If your paper has both, your odds are better than the 10% headline suggests.

How often does JCI desk-reject submissions?

JCI desk-rejects 60-70% of submissions, typically within 2-3 weeks. The most common reason is that the paper has one of JCI's two requirements (mechanism or disease relevance) but not both.

How is JCI different from Nature Medicine?

JCI wants the mechanism to be the centerpiece with disease relevance built in. Nature Medicine wants the translational path to be the centerpiece. A paper about a newly discovered disease pathway fits JCI. A paper about moving a known mechanism toward therapy fits Nature Medicine.

References

  1. Clarivate Journal Citation Reports, 2024 JCR: JCI IF 13.6, Q1, rank 5/195 Medicine, General and Internal
  2. JCI information for authors
  3. JCI journal homepage
  4. American Society for Clinical Investigation
