Journal Guides · 3 min read · Updated Mar 27, 2026

Bioinformatics Acceptance Rate

Bioinformatics's acceptance rate in context, including how selective the journal really is and what the number leaves out.

Author context: Senior Researcher, Molecular & Cell Biology. Experience with Molecular Cell, Nature Cell Biology, EMBO Journal.

Journal evaluation

Want the full picture on Bioinformatics?

See scope, selectivity, submission context, and what editors actually want before you decide whether Bioinformatics is realistic.

Selectivity context

What Bioinformatics's acceptance rate means for your manuscript

Acceptance rate is one signal. Desk rejection rate, scope fit, and editorial speed shape the realistic path more than the headline number.

Full journal profile
Acceptance rate: ~20-25% (overall selectivity)
Impact factor: ~5.8 (Clarivate JCR)
Time to first decision: ~60-90 days (median)

What the number tells you

  • Third-party estimates put Bioinformatics's acceptance rate around 20-25%, and desk rejection accounts for a disproportionate share of early returns.
  • Scope misfit drives most desk rejections, not weak methodology.
  • Papers that reach peer review face a higher bar: novelty and fit with editorial identity.

What the number does not tell you

  • Whether your specific paper type (review, letter, brief communication) faces the same rate as full articles.
  • How fast you will hear back — check time to first decision separately.
  • What open access publishing will cost if you choose that route.

Quick answer: there is no strong official Bioinformatics acceptance-rate number. OUP does not publish one. The real submission question is whether the method or tool fills a genuine gap, the code is publicly available, and benchmarks against current methods are honest and comprehensive. With an impact factor around 5.8, Bioinformatics is the field's flagship methods journal, but the editorial screen is about code, benchmarks, and gap-filling, not just algorithmic novelty.

If the code is not publicly available or benchmarks against current tools are missing, those gaps are the problem before the acceptance rate is.

How Bioinformatics's Acceptance Rate Compares

| Journal | Acceptance Rate | IF (2024) | Review Model |
|---|---|---|---|
| Bioinformatics | ~20-25% | 5.4 | Novelty |
| Genome Biology | ~10-15% | 9.4 | Novelty |
| BMC Bioinformatics | ~40-50% | 2.9 | Soundness |
| Nucleic Acids Research | ~40-50% | 13.1 | Novelty |
| Briefings in Bioinformatics | ~20-25% | 7.7 | Novelty |

What you can say honestly about the acceptance rate

Oxford University Press does not publish an official acceptance rate for Bioinformatics.

Third-party estimates place the rate around 20-25%, making it one of the more selective computational biology venues. The journal receives a high volume of tool and method submissions, and the editorial team has decades of experience filtering for tools that actually work and fill genuine gaps.

What is stable is the editorial model:

  • all software tools must have publicly available source code at the time of submission
  • benchmarks against current state-of-the-art methods on real data are expected, not optional
  • the journal distinguishes between Original Papers (full methods) and Applications Notes (2-page tool announcements)
  • code quality, documentation, and installability are evaluated during review
  • the primary question is whether the computational method or tool is the main contribution, not the biology it is applied to

That code-and-benchmarks requirement is the real structural filter. No public repository means no consideration, regardless of algorithmic elegance.

What the journal is really screening for

At triage, the editor is asking:

  • is the source code publicly available in a repository, not just "available upon request"?
  • has the tool been benchmarked against current competitors on real biological data, not just simulations?
  • does this method fill a genuine gap, or is it an incremental wrapper around existing software?
  • is the contribution methodological, or is this really a biology paper that happens to use computation?

A paper with a public repository, clear documentation, honest benchmarks on real data, and a genuine methodological contribution will survive triage more reliably than one with elegant algorithms but no code or comparisons.

The better decision question

For Bioinformatics, the useful question is:

Is the primary contribution a new computational method or tool that fills a genuine gap, with publicly available code and benchmarks against current state-of-the-art on real data?

If yes, Bioinformatics is the right fit. If the contribution is primarily a biological discovery enabled by computation, a domain-specific biology journal is more appropriate. If the tool is solid but not novel enough for Bioinformatics, BMC Bioinformatics is a reasonable alternative.

Where authors usually get this wrong

The common misses are:

  • submitting without a public code repository, which triggers immediate desk rejection
  • benchmarking against outdated methods from 2015 while ignoring current competitors
  • inflating a 2-page Applications Note into a full Original Paper by padding the methods section
  • describing a database or web portal without a meaningful computational method behind it
  • neglecting code documentation and installation instructions, which reviewers test during review

Those are code, benchmark, and format problems before they are rate problems.

What to use instead of a guessed percentage

If you are deciding whether to submit, the journal's scope, format requirements, and selectivity context are more useful than an unofficial rate.

Together, they tell you whether the code and benchmarks are sufficient, whether Original Paper or Applications Note is the right format, and when Genome Biology or BMC Bioinformatics might be better targets.

Submit if / Think twice if

Submit if:

  • the method or tool fills a genuine gap: existing approaches cannot solve the problem the paper addresses, or the existing tools have documented limitations that the new method overcomes with demonstrated improvement
  • the code is publicly available before submission: a GitHub repository with installation instructions, example data, and documentation is the minimum standard
  • benchmarks against current state-of-the-art tools are included: the evaluation uses real biological datasets, compares against the best-performing existing methods under the same conditions, and reports performance honestly including cases where competitors perform comparably
  • the format matches the contribution: Original Paper for a novel algorithm with deep methodological development, Applications Note for a well-engineered tool that fills a practical gap

Think twice if:

  • the code is not publicly available or is available only on reasonable request: this is a desk rejection trigger
  • the benchmarks compare against old or weak tools rather than current best-performing methods: reviewers will flag selective benchmarking
  • BMC Bioinformatics is the more appropriate home for a solid applied tools paper without the methodological novelty Bioinformatics requires
  • the paper's primary contribution is a biological result using existing tools, not a new computational method: Genome Biology, PLoS Computational Biology, or the biology journal for the specific area is the right target

Readiness check

See how your manuscript scores against Bioinformatics before you submit.

Run the scan with Bioinformatics as your target journal. Get a fit signal alongside the IF context.


What Pre-Submission Reviews Reveal About Bioinformatics Submissions

In our pre-submission review work evaluating manuscripts targeting Bioinformatics, three patterns generate the most consistent desk rejections. Each reflects the journal's documented requirements: code availability, honest benchmarking, and genuine methodological contribution.

No publicly available code at submission. The Bioinformatics author instructions state explicitly that "software tools must be freely available" and that the source code must be deposited in a public repository. Papers submitted without a working, documented public implementation are desk-rejected. This is not a soft requirement. A link to a GitHub repository with installation instructions and at least one example or tutorial is the minimum. Papers where code availability is stated as "available upon reasonable request," "upon acceptance," or "in supplementary files" rather than in a public version-controlled repository fail the code standard. The journal's reviewer pool includes active computational biologists who will attempt to install and run the tool. Packages that cannot be installed from the repository following the provided instructions generate immediate rejection recommendations.
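The reviewer check described above amounts to a smoke test: install from the repository, run the documented quickstart command, and confirm it exits cleanly. A minimal sketch of that check, assuming a hypothetical tool name (`mytool` does not exist; Python's own `--version` flag stands in for the tool's CLI so the sketch runs anywhere):

```python
# Sketch of the smoke test a reviewer typically runs after installing a tool
# from its public repository: execute the documented quickstart command and
# check that it exits cleanly.
import subprocess
import sys

def smoke_test(cmd):
    """Run a documented quickstart command; return True if it exits cleanly."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.returncode == 0

# In a real check this would be something like smoke_test(["mytool", "--help"])
# after `pip install` from the repository ("mytool" is hypothetical).
# Here the Python interpreter itself stands in for the tool's CLI.
ok = smoke_test([sys.executable, "--version"])
print("quickstart command exits cleanly:", ok)
```

If this check fails when run from a clean environment following only the README, the submission fails the same way in review.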

Benchmarks against outdated or weak methods. Bioinformatics reviewers are familiar with the current landscape of methods in the areas the paper covers. The failure pattern is a benchmarking section that compares the new method favorably against tools published more than 2-3 years ago without including the current state-of-the-art competitors. An RNA-seq differential expression tool benchmarked against DESeq version 1 and edgeR 2016 but not DESeq2 and edgeR 2023, a sequence alignment tool benchmarked against BWA-MEM but not BWA-MEM2 or minimap2, or a protein structure prediction tool evaluated only against methods predating AlphaFold2 signals to reviewers that the authors have not engaged with the current literature. Selective benchmarking that avoids the hardest comparisons is identified and generates major revision requests or rejection on the basis that the performance advantage has not been demonstrated under conditions that matter to current practitioners.
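The honest-benchmarking expectation above has a simple structural form: every method, including the new one, runs on the same inputs and is scored with the same metric, and all scores are reported. A minimal sketch under stated assumptions (the two "methods" here are toy stand-ins, not real bioinformatics tools; in a real comparison each entry would wrap a current state-of-the-art competitor):

```python
# Minimal sketch of an honest benchmark harness: all methods, new and
# baseline, see identical datasets and are scored with one shared metric.
import random

def method_new(values):
    """Hypothetical new method: a 10% trimmed mean."""
    s = sorted(values)
    k = max(1, len(s) // 10)
    trimmed = s[k:-k]
    return sum(trimmed) / len(trimmed)

def method_baseline(values):
    """Stand-in for a current competitor: a plain mean."""
    return sum(values) / len(values)

def benchmark(methods, datasets, truth):
    """Score every method on every dataset with the same error metric."""
    results = {}
    for name, fn in methods.items():
        errors = [abs(fn(d) - t) for d, t in zip(datasets, truth)]
        results[name] = sum(errors) / len(errors)
    return results

random.seed(0)
truth = [10.0, 20.0, 30.0]
datasets = [[t + random.gauss(0, 1) for _ in range(100)] for t in truth]

scores = benchmark({"new_tool": method_new, "baseline": method_baseline},
                   datasets, truth)
for name, err in sorted(scores.items(), key=lambda kv: kv[1]):
    # Report every method's score, including cases where the new tool
    # is only comparable to the baseline.
    print(f"{name}: mean abs error = {err:.3f}")
```

The design point is that the harness, not the author, decides the ordering: omitting a strong competitor from the `methods` dict is exactly the selective benchmarking reviewers flag.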

Method paper without biological application or biological results paper submitted as a methods paper. Bioinformatics has a specific scope: computational methods, tools, and algorithms for biological data analysis. The failure pattern comes in two forms. First, a paper presenting a new algorithm or tool without demonstrating that it produces biologically meaningful results on real data: a clustering algorithm that shows better silhouette scores on simulated data without demonstrating whether the clusters correspond to biologically meaningful cell populations or genomic features. Second, a paper presenting biological findings (new gene-disease associations, new mutational signatures, new regulatory networks) where the methods section describes the application of existing tools rather than a new computational contribution. The first type belongs in Bioinformatics but needs real biological validation. The second type belongs in biology journals. A Bioinformatics submission readiness check can assess whether the computational contribution and biological validation are both present before submission.

Practical verdict

The honest answer to "what is the Bioinformatics acceptance rate?" is that OUP does not publish one, and third-party estimates should not be treated as precise.

The useful answer is:

  • yes, the journal is fairly selective among computational biology venues
  • no, a guessed percentage is not the right planning tool
  • use code availability, benchmark honesty, and genuine methodological contribution as the real filter instead

If you want help pressure-testing whether this manuscript meets Bioinformatics's code and benchmarking standards before upload, a Bioinformatics submission readiness check is the best next step.

What the acceptance rate does not tell you

The acceptance rate for Bioinformatics does not distinguish between desk rejections and post-review rejections. A paper desk-rejected in 2 weeks and a paper rejected after 4 months of review both count the same. The rate also does not reveal how acceptance varies by article type, geographic origin, or research area within the journal's scope.

Acceptance rates cannot predict your individual odds. A strong paper with clear scope fit, complete data, and solid methodology has substantially better odds than the headline number suggests. A weak paper with methodology gaps will be rejected regardless of the journal's overall rate.

A Bioinformatics submission readiness check identifies the specific framing and scope issues that trigger desk rejection before you submit.

Before you submit

A Bioinformatics desk-rejection risk check scores fit against the journal's editorial bar.

Frequently asked questions

Does Bioinformatics publish an official acceptance rate?

No. Oxford University Press does not release official acceptance-rate figures for Bioinformatics. Third-party estimates in the 20-25% range suggest the journal is fairly selective, but no exact figure is publisher-confirmed. The useful planning question is whether the method or tool fills a genuine gap and the code is publicly available with honest benchmarks.

What matters most for getting past the editorial screen?

Code availability and benchmarking. Bioinformatics requires publicly available source code and expects benchmarks against current state-of-the-art tools on real data. A tool without a public repository will be desk-rejected. A tool without comparative benchmarks will not survive review.

What is the Bioinformatics impact factor?

The 2025 JCR impact factor is approximately 5.8. Bioinformatics holds Q1 status in Mathematical and Computational Biology and is the default destination for computational biology tool and method papers.

What is the difference between an Original Paper and an Applications Note?

Original Papers are full-length method articles requiring deep algorithmic novelty and extensive benchmarking. Applications Notes are 2-page tool announcements judged on usability and gap-filling rather than methodological depth. If you have a well-engineered tool but not a novel algorithm, Applications Notes is the right format.

References

  1. Bioinformatics journal page, Oxford University Press.
  2. Bioinformatics author guidelines, OUP.
  3. Clarivate Journal Citation Reports, 2025 edition (IF ~5.8).
  4. SCImago Journal & Country Rank: Bioinformatics, Q1 ranking.
