How to Avoid Desk Rejection at Sensors
The editor-level reasons papers get desk rejected at Sensors, plus how to frame the manuscript so it looks like a fit from page one.
Senior Researcher, Oncology & Cell Biology
Author context
Specializes in manuscript preparation and peer review strategy for oncology and cell biology, with deep experience evaluating submissions to Nature Medicine, JCO, Cancer Cell, and Cell-family journals.
Desk-reject risk
Check desk-reject risk before you submit to Sensors.
Run the Free Readiness Scan to catch fit, claim-strength, and editor-screen issues before the first read.
What Sensors editors check before sending to review
Most desk rejections trace to scope misfit, framing problems, or missing requirements — not scientific quality.
The most common desk-rejection triggers
- Scope misfit — the paper does not match what the journal actually publishes.
- Missing required elements — formatting, word count, data availability, or reporting checklists.
- Framing mismatch — the manuscript does not communicate why it belongs in this specific journal.
Where to submit instead
- Identify the exact mismatch before choosing the next target — it changes which journal fits.
- Scope misfit usually means a more specialized or broader venue, not a lower-ranked one.
- Sensors accepts ~50-60% of submissions overall. Higher-rate journals in the same field are not always lower prestige.
How Sensors is likely screening the manuscript
Use this as the fast-read version of the page. The point is to surface what editors are likely checking before you get deep into the article.
| Question | Quick read |
|---|---|
| Editors care most about | Novel sensing platform or approach with demonstrated detection capability |
| Fastest red flag | Demonstrating analyte detection in pure solutions without real-sample testing |
| Typical article types | Article, Review, Short Note |
| Best next step | Manuscript preparation |
Quick answer: Your sensor paper needs real-sample testing and complete characterization data before submission. Sensors isn't looking for proof-of-concept studies or buffer-solution demonstrations. The journal wants sensors that work in realistic conditions with full performance metrics documented.
Most authors miss this standard and get rejected at editorial screening because they're solving the wrong problem. They optimize sensitivity in clean solutions instead of proving their sensor handles interference, maintains stability, and detects targets in actual samples. That disconnect kills submissions before peer review starts.
The decision framework is simple but unforgiving. If your sensor only works in laboratory buffer solutions, don't submit yet. If you have sensitivity data but no selectivity studies, you're not ready. Sensors editors can spot incomplete characterization in the abstract, and they reject papers that try to pass off preliminary studies as complete sensor development.
In our pre-submission review work with Sensors submissions
We see Sensors desk rejections cluster around one repeated pattern: the authors optimized signal quality in clean conditions and treated that as proof of sensor readiness. Editors usually care more about whether the device survives real samples, interference, and drift than about one attractive limit-of-detection number in buffer.
We also see papers lose credibility when the characterization package feels incomplete for the claimed application. If the manuscript promises clinical, environmental, or food-safety relevance but skips stability, reproducibility, or matrix effects, the submission starts to look like a lab demonstration rather than a deployable sensor paper.
Common desk rejection reasons at Sensors
| Reason | How to avoid |
|---|---|
| Sensor tested only in buffer solutions | Validate in real samples: blood serum, river water, food matrices |
| Missing selectivity or interference studies | Test against common interferents relevant to the target application |
| No stability or shelf-life data | Include long-term stability, reproducibility, and operational lifetime data |
| Incomplete analytical characterization | Report sensitivity, selectivity, stability, reproducibility, and limit of detection |
| Missing mechanism understanding | Explain why the sensing approach works at a molecular or materials level |
| Scope mismatch | Submit fundamental detection-principle studies to chemistry or materials journals instead |
| No advancement over existing methods | Demonstrate clear practical advantages over commercial sensors or published methods |
MDPI's editorial model and what it means for Sensors
Sensors operates under MDPI's rapid editorial model, which means desk decisions typically come within 2-4 weeks. This speed works against authors who submit incomplete work hoping for constructive reviewer feedback: the paper gets rejected before it ever reaches reviewers.
MDPI journals, including Sensors, accept diverse sensor types: biosensors, chemical sensors, physical sensors, and smart sensors for IoT applications. This diversity suggests broad acceptance criteria, but every sensor type must meet the same practical implementation standard. Whether you're detecting glucose or monitoring structural vibrations, your sensor needs real-sample testing and complete characterization.
Authors sometimes misread Sensors as a lower-bar venue because of MDPI's volume publishing model. That's a mistake. The journal's 3.5 impact factor reflects genuine selectivity on practical sensor development, and the editorial board includes active researchers who recognize incomplete characterization immediately.
Why buffer-only studies get rejected
Sensors publishes applied sensor technology, not fundamental studies of molecular recognition or signal transduction. The editorial filter catches authors off guard because it's about scope, not science quality.
Consider two glucose sensor papers. Paper A: new electrochemical sensor with 0.1 mM detection limit in phosphate buffer, stable response over 100 measurements, detailed electrochemical characterization. Paper B: similar sensor with 0.5 mM detection limit tested in human serum, interference studies with fructose and galactose, 30-day stability data at room temperature.
Paper B gets accepted despite worse detection limits because it demonstrates practical sensor performance. Paper A gets desk rejected because it's a proof-of-concept study disguised as sensor development.
The journal's readership develops sensor technology for real applications: medical diagnostics, environmental monitoring, food safety, and industrial process control. They need sensors that work outside the laboratory. An impact factor of 3.5 positions Sensors competitively, but acceptance requires meeting practical implementation standards regardless of sensor type.
What real-sample testing actually requires
Testing your cancer biomarker sensor in diluted human serum isn't sufficient. Testing your pesticide detector in spiked river water doesn't prove environmental monitoring capability. Real-sample testing means actual samples from the intended application environment without artificial simplification.
Real samples introduce complications absent in controlled conditions:
- Biological samples: Proteins that foul sensor surfaces, salt concentrations that shift electrochemical baselines, pH variations that affect binding kinetics
- Environmental samples: Particulates that block optical signals, competing ions that interfere with selective recognition, organic matter that changes surface chemistry
- Dynamic effects: Response time increases in viscous biological fluids, detection limit degradation from sample turbidity, signal drift from active biochemical processes
A heavy metal sensor might work perfectly in laboratory solutions with cadmium, mercury, and lead at known concentrations. But groundwater contains natural organic acids that complex with metals, bacterial biofilms that modify electrode surfaces, and ionic strength variations that affect mass transport.
Editors spot authentic real-sample data because it shows more variability, requires more statistical analysis, and includes discussion of matrix effects. Clean laboratory data is suspiciously consistent.
Most sensor papers fail here because authors test simplified versions of real samples: filtered biological fluids, synthetic mixtures designed to mimic complexity, or spiked solutions that approximate real conditions. These approaches miss the unpredictable interactions that define sensor performance in actual use.
Testing in real samples also reveals practical limitations invisible in controlled studies. Response time might increase when sensors encounter viscous biological fluids. Detection limits might degrade when dealing with sample turbidity or color interference. Signal drift might accelerate in samples with active biochemical processes.
Smart sensor developers test crude prototypes in actual samples early to identify failure modes before optimizing the design. This prevents spending months of optimization on laboratory performance that doesn't translate.
The selectivity gap editors spot immediately
Reporting that your glucose sensor doesn't respond to fructose isn't sufficient selectivity analysis. Sensors editors want comprehensive interference studies reflecting the complexity of real analytical environments.
Complete selectivity characterization requires:
- Testing against all molecules reasonably present in target samples at realistic concentrations
- Quantitative interference coefficients and selectivity ratios, not qualitative "no interference" statements
- Detection limit changes in the presence of common interferents across their concentration range
- Error bars showing measurement uncertainty
- Discussion of how interference effects change with sensor aging or surface modification
For example, dopamine sensors for neurochemical monitoring must account for ascorbic acid, uric acid, serotonin, and norepinephrine, often present at higher concentrations than dopamine itself. Testing each compound individually and in mixtures that represent realistic neurochemical environments is the minimum.
Authors often try to shortcut selectivity studies by testing only a few obvious interferents or by testing at concentrations that don't reflect real sample conditions. This produces incomplete characterization that editors recognize immediately. If your glucose sensor claims good selectivity but you only tested fructose and sucrose at millimolar concentrations, you haven't characterized selectivity for clinical glucose monitoring where multiple interfering compounds exist at various concentration levels.
Smart selectivity studies also consider dynamic interference effects. Some interfering compounds don't produce false positive signals but reduce sensor sensitivity over time through surface fouling or competitive binding. These effects only become apparent during extended testing periods or repeated measurements in complex samples.
Selectivity requirements connect directly to real-sample testing because interference effects often amplify in complex matrices. A sensor might show good selectivity in buffer but suffer significant interference when the same compounds appear in biological fluids with different pH, ionic strength, or protein content.
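To make the "quantitative interference coefficients and selectivity ratios" requirement concrete, here is a minimal sketch of how a selectivity ratio can be reported as interferent response per unit concentration relative to the target's. All sensitivities and compound values below are hypothetical illustrations, not data from any real sensor:

```python
# Hypothetical example: reporting quantitative selectivity ratios for a
# glucose sensor. Sensitivities (signal per mM) are illustrative only.

def selectivity_ratio(target_sensitivity, interferent_sensitivity):
    """Interferent response relative to target response per unit concentration.

    Values well below 1 indicate the sensor strongly favors the target analyte.
    """
    return interferent_sensitivity / target_sensitivity

glucose_sensitivity = 12.5          # nA/mM (hypothetical)
interferent_sensitivities = {
    "fructose": 0.31,               # nA/mM (hypothetical)
    "ascorbic acid": 0.95,
    "uric acid": 0.48,
}

for name, sens in interferent_sensitivities.items():
    ratio = selectivity_ratio(glucose_sensitivity, sens)
    print(f"{name}: selectivity ratio = {ratio:.3f}")
```

Reporting a number like this for every relevant interferent, at realistic concentrations, is what replaces a qualitative "no interference observed" statement.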
Sensors vs. IEEE Sensors Journal: choosing the right venue
Authors often submit to Sensors (MDPI) when IEEE Sensors Journal would be a better fit, or vice versa. The two journals have different editorial priorities despite overlapping scope.
| Criterion | Sensors (MDPI) | IEEE Sensors Journal |
|---|---|---|
| Core priority | Practical sensor implementation with real-sample data | Technical sensor innovation with rigorous engineering analysis |
| Real-sample testing | Required for most paper types | Valued but not always required for novel transduction mechanisms |
| Theoretical depth | Expected but secondary to practical demonstration | Can carry a paper if engineering contribution is strong |
| Review speed | 2-4 weeks to desk decision | 4-8 weeks typical |
| Impact factor | ~3.5 | ~4.3 |
| Open access | Yes (APC required) | Hybrid (OA optional) |
| Best fit for | Complete sensor development with field/clinical validation | Novel sensing mechanisms, signal processing advances, sensor system design |
If your paper is strong on novel transduction physics but light on real-sample validation, IEEE Sensors Journal may be more receptive. If your paper demonstrates complete practical sensor development with real-world testing, Sensors is the natural home.
Submit if your sensor does these 3 things
- Reliable detection in realistic samples. Consistent, measurable responses to target analytes in actual samples from the intended application environment. Not spiked solutions. Not synthetic mixtures. Your sensor maintains detection capability across the concentration range needed for practical applications.
- Complete analytical characterization. The full performance profile: sensitivity, selectivity, detection limits, linear range, response time, stability, interference studies, calibration curves in realistic matrices, and error analysis showing measurement uncertainty.
- Reproducible fabrication. Multiple sensor batches producing similar performance characteristics with acceptable variation. You understand which fabrication parameters affect performance and can document protocols that produce consistent results.
These criteria aren't just editorial preferences. They reflect practical requirements for sensor technology that could actually be used for its intended purpose. Clinical diagnostic sensors need regulatory approval, which requires extensive validation in patient samples with full analytical characterization. Environmental monitoring sensors need field deployment capability, which requires robust fabrication and known performance boundaries.
Most importantly, your sensing mechanism should be understood well enough to explain why the sensor works and predict how performance might change under different conditions. This doesn't require complete theoretical modeling, but it does require understanding the fundamental chemistry or physics that generates sensor signals.
Don't delay submission trying to optimize detection limits if you already have complete data. Sensors values practical demonstration over academic perfectionism. A working sensor with understood limitations beats an optimized sensor that only works under controlled conditions.
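The detection-limit and calibration items above have a standard quantitative form. As a sketch, the widely used convention LOD = 3.3·σ/slope (σ = standard deviation of blank replicates, slope from the linear calibration fit) can be computed directly from calibration data. All concentrations and signals below are hypothetical:

```python
# Sketch: limit of detection from a linear calibration curve, using the
# common LOD = 3.3 * sigma / slope convention. All data are hypothetical.
import statistics

def linear_fit(conc, signal):
    """Least-squares slope and intercept for signal = slope * conc + intercept."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(signal) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, signal))
             / sum((x - mx) ** 2 for x in conc))
    return slope, my - slope * mx

conc   = [0.1, 0.2, 0.5, 1.0, 2.0]        # mM (hypothetical)
signal = [1.3, 2.4, 6.1, 12.2, 24.5]      # nA (hypothetical)
blank  = [0.10, 0.14, 0.09, 0.12, 0.11]   # blank replicate signals, nA

slope, intercept = linear_fit(conc, signal)
lod = 3.3 * statistics.stdev(blank) / slope
print(f"slope = {slope:.2f} nA/mM, LOD = {lod:.4f} mM")
```

The same calibration should be repeated in the realistic matrix, since slope and blank noise both change outside buffer.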
Hold off if you're missing these
- Stability studies: Long-term performance data showing how sensor response changes over time under storage and operating conditions. Shelf-life data at different temperatures. Operational stability over hundreds of measurement cycles.
- Mechanism understanding: You can explain why your sensor produces measurable signals. Surface binding kinetics, electron transfer mechanisms, optical property changes. Enough mechanistic insight to predict sensor behavior.
- Practical implementation data: Response time measurements, sample volume requirements, and detection protocols that could realistically be used by intended end users.
Missing any of these signals incomplete sensor development. The journal doesn't publish proof-of-concept studies, and editorial screening catches incomplete characterization quickly.
Consider whether you're trying to publish too early in the sensor development process. Many authors submit papers based on initial promising results before they've fully characterized sensor performance or demonstrated practical implementation. This usually leads to desk rejection because the manuscript presents preliminary data as complete sensor development. Better to delay submission until you have complete characterization than to submit prematurely and face rejection.
The review process at Sensors is fast enough that weak manuscripts get filtered early. If your paper has obvious gaps in characterization or testing, expect rapid desk rejection rather than extended peer review. Focus on strengthening the weakest aspects of your sensor characterization before submission. If you have good sensitivity and selectivity data but limited stability studies, invest time in long-term testing. If you have complete analytical data but limited real-sample testing, prioritize practical validation experiments.
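The stability and reproducibility data discussed above also reduce to simple reportable numbers: percent signal retention over repeated cycles and relative standard deviation (RSD) across measurements. A minimal sketch, with hypothetical cycle data:

```python
# Sketch of operational-stability reporting (hypothetical data): percent
# signal retention over repeated measurement cycles, plus relative
# standard deviation (RSD) as a reproducibility metric.
import statistics

def signal_retention(initial, current):
    """Percent of the initial response retained after repeated cycles."""
    return 100.0 * current / initial

def rsd_percent(values):
    """Relative standard deviation, a common reproducibility metric."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Response (nA) sampled every 100 cycles from 0 to 500 (hypothetical).
responses = [10.2, 10.0, 9.9, 9.7, 9.4, 9.1]

print(f"retention after 500 cycles: {signal_retention(responses[0], responses[-1]):.1f}%")
print(f"RSD across cycles: {rsd_percent(responses):.1f}%")
```

The same two metrics computed across fabrication batches, rather than cycles, document the reproducible-fabrication criterion.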
Final Sensors fit check before you submit
- Validate the sensor in real samples rather than only in idealized or spiked laboratory mixtures
- Report the complete analytical profile, including stability, interference, and operating limits
- Explain the sensing mechanism well enough to predict behavior under realistic conditions
- Show batch-to-batch or protocol-level reproducibility rather than one-off prototype success
- Make the implementation case practical for the intended user or setting
- Choose Sensors only if the paper still looks like complete sensor development after the best lab-only result is stripped away
A Sensors desk-rejection risk check can flag the desk-rejection triggers covered above before your paper reaches the editor.
Frequently asked questions

Does Sensors desk reject incomplete sensor papers?
Yes. Sensors (MDPI) filters submissions that lack real-sample testing or complete characterization data. Papers demonstrating proof-of-concept in buffer solutions only are rejected at editorial screening.

What are the most common desk-rejection reasons at Sensors?
Sensors tested only in laboratory buffer solutions without real samples, missing selectivity or interference studies, no shelf-life or stability data, incomplete characterization lacking sensitivity, selectivity, stability, and reproducibility metrics, and missing mechanism understanding of why the sensing approach works.

How quickly does Sensors make desk decisions?
Sensors editors decide relatively quickly under MDPI's rapid editorial model, typically within 2-4 weeks of submission.

What do Sensors editors look for before sending a paper to review?
Two core elements: practical sensor performance in real samples such as blood serum, river water, or food matrices, and complete analytical characterization including sensitivity, selectivity, stability, reproducibility, interference studies, and mechanism understanding.