Remote Sensing Acceptance Rate
Remote Sensing publishes ~6,000 articles per year with Q1 ranking in Earth Sciences. Here is what the acceptance rate data tells you.
Senior Researcher, Chemistry
Author context
Specializes in manuscript preparation and peer review strategy for chemistry journals, with deep experience evaluating submissions to JACS, Angewandte Chemie, Chemical Reviews, and ACS-family journals.
Quick answer: Remote Sensing (MDPI) publishes roughly 6,000 articles per year, ranks Q1 in Earth Sciences with an IF of 4.1, and delivers first decisions in 3-6 weeks. The estimated acceptance rate of 45-55% is high by traditional standards but consistent with MDPI's open-access model. The name similarity with Remote Sensing of Environment (Elsevier, IF ~13.5) causes confusion, but these journals occupy different tiers with fundamentally different editorial philosophies. The submission decision is less about beating selectivity and more about whether Remote Sensing or a more selective venue is the right home for your work.
What you can say honestly about the acceptance rate
MDPI publishes journal-level statistics including decision times and article volumes. Remote Sensing's annual output of ~6,000 papers, combined with community data from LetPub and SciRev, supports an estimated acceptance rate of 45-55%. Desk rejection runs lower than traditional journals, probably 15-25%, because MDPI's model pushes more papers into review rather than filtering heavily at triage.
This rate is genuinely different from the field's top venues. Remote Sensing of Environment accepts roughly 15-20% and publishes about 600 papers per year. IEEE Transactions on Geoscience and Remote Sensing sits at 25-30% with about 1,200 papers annually. ISPRS Journal of Photogrammetry and Remote Sensing is similarly selective at ~15-20% with about 400 papers per year. Remote Sensing (MDPI) is not competing with these journals on selectivity. It filters for technical soundness and scope fit rather than demanding field-advancing contributions.
The 3-6 week first decision is real and valuable. Academic editors (volunteer researchers, not professional staff) handle initial assessment quickly, and MDPI's system gives reviewers tight 10-14 day deadlines. For comparison, RSE, IEEE TGRS, and ISPRS JPRS typically take 3-6 months for a first decision. For PhD students facing defense deadlines or researchers on grant reporting timelines, this speed difference is a genuine strategic advantage that should be weighed alongside selectivity and prestige considerations.
The journal's scope is broad: satellite imagery analysis, SAR, LiDAR, photogrammetry, GIS, hyperspectral imaging, UAV sensing, atmospheric correction algorithms, land use classification, crop monitoring, ocean color remote sensing, and disaster mapping. That breadth is part of why the volume is so high. Scope-based desk rejection is rare; if your work involves remote sensing data or methods in any meaningful way, it is probably within scope.
What the journal is really screening for
Academic editors check scope, basic scientific soundness, and whether the paper reads as research rather than a consulting report or technical memo. Two external reviewers then evaluate the work under single-blind review. The APC is approximately 2,600 CHF (~$2,900 USD) with no subscription-track alternative.
Reviewers screen for three things above all.
First, validation against established benchmarks: if you are proposing a new method, compare it to at least two or three published approaches on a recognized dataset. An enormous number of submissions skip this step, and it is one of the most common reasons for rejection even at a journal with a 45-55% acceptance rate.
Second, proper description of study areas and data: sensor type, acquisition dates, preprocessing steps, spatial context, and justification for why that particular location matters. Papers from researchers who are domain experts (forestry, urban planning, hydrology) but are not used to remote sensing journal conventions often stumble here.
Third, honest treatment of accuracy metrics. A land cover classification reporting 95% overall accuracy without discussing class imbalance or per-class performance will get flagged by any experienced reviewer. Report balanced accuracy, F1 per class, or kappa alongside overall accuracy.
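To make that reporting checklist concrete, here is a minimal pure-Python sketch of the four metrics reviewers expect to see together. The function name and labels are illustrative, not from any specific journal guideline; in a real pipeline you would use sklearn.metrics (accuracy_score, f1_score, balanced_accuracy_score, cohen_kappa_score) rather than hand-rolled arithmetic.

```python
from collections import Counter

def classification_report_sketch(y_true, y_pred, labels):
    """Overall accuracy, per-class F1, balanced accuracy, and Cohen's kappa
    from two label sequences. Pure-Python sketch of the arithmetic only."""
    n = len(y_true)
    overall = sum(t == p for t, p in zip(y_true, y_pred)) / n

    recalls, f1_per_class = [], {}
    for c in labels:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        recall = tp / (tp + fn) if tp + fn else 0.0
        precision = tp / (tp + fp) if tp + fp else 0.0
        f1_per_class[c] = (2 * precision * recall / (precision + recall)
                           if precision + recall else 0.0)
        recalls.append(recall)

    balanced = sum(recalls) / len(labels)  # mean of per-class recalls

    # Cohen's kappa: observed agreement corrected for the chance agreement
    # implied by the two marginal label distributions.
    true_freq, pred_freq = Counter(y_true), Counter(y_pred)
    p_chance = sum(true_freq[c] * pred_freq[c] for c in labels) / (n * n)
    kappa = (overall - p_chance) / (1 - p_chance) if p_chance < 1 else 0.0
    return overall, f1_per_class, balanced, kappa
```

Reporting all four numbers side by side is what distinguishes a defensible accuracy section from one that invites reviewer pushback.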
Papers that are pure application without any methodological contribution face the tightest scrutiny. Running an existing classifier on satellite data for a study area and reporting overall accuracy is a technical report, not a research paper. Even at Remote Sensing, reviewers expect something new: a modified algorithm, a novel data fusion approach, a new validation framework, or at least a genuinely under-studied application domain that fills a gap in the literature.
The journal also receives an enormous number of deep learning papers applying standard architectures (U-Net, ResNet, transformer variants) to satellite imagery. If your contribution is "we applied DeepLabV3+ to our study area," that is not sufficient. Explain exactly what architectural modifications you made and why they matter for remote sensing data specifically.
The better decision question
For Remote Sensing, the real decision is about career positioning and venue fit. Three questions determine whether this is the right choice.
First, could this paper realistically land at a more selective venue? If it presents a genuinely new algorithm, a large-scale global analysis, or a methodological advance that the community would cite heavily, it belongs at RSE, IEEE TGRS, or ISPRS JPRS. Submitting it to Remote Sensing means faster publication but less citation traction and less recognition from hiring committees. A paper that could get into RSE but ends up here will not achieve the same impact. Do not submit to Remote Sensing if you are settling.
Second, is the APC compatible with your funding? At approximately 2,600 CHF, every accepted paper requires payment. Unlike hybrid journals where subscription-track publishing costs nothing, there is no free route at Remote Sensing. Check whether your institution has an MDPI agreement before assuming you are paying full price. MDPI also offers waivers on a case-by-case basis for authors from low-income countries.
Third, is the paper a solid regional case study, an incremental improvement on an existing method, or a preliminary proof-of-concept? If so, Remote Sensing is a legitimate, indexed home for it. The Q1 ranking and IF of 4.1 provide real indexing value, the papers get cited, and the turnaround is hard to beat in the Earth observation field. International Journal of Applied Earth Observation and Geoinformation (JAG, IF ~7.6, ~25-30% acceptance) is worth considering as a middle-ground alternative: more selective and higher-impact than Remote Sensing, but faster and more application-friendly than RSE or IEEE TGRS.
Where authors usually get this wrong
The most common mistake is submitting application work with no methodological contribution and no novel study context. A paper that applies an off-the-shelf method to a standard dataset for a well-studied region does not clear the bar even at a journal with a 45-55% acceptance rate.
The second mistake is treating a special issue invitation email as a meaningful signal about paper fit. MDPI special issues generate a large fraction of Remote Sensing's submissions, and invitations go out broadly to anyone with a tangentially related publication. Evaluate the guest editor's track record and check what other papers have already been published in the collection. Some special issues are rigorously curated. Others are not.
The third mistake is inflating accuracy claims. Reviewers who handle remote sensing papers daily can spot inflated metrics instantly. An overall accuracy of 95% on a binary classification where 90% of pixels belong to one class is not impressive; it is expected from a naive classifier. Report balanced accuracy, F1 scores per class, or kappa alongside overall accuracy, and make sure your abstract's claims match what your results section actually demonstrates.
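The arithmetic behind that reviewer flag is worth seeing once. A hypothetical sketch, assuming a binary land/water scene where 90% of pixels are one class and a classifier that simply predicts the majority class everywhere:

```python
# 100 pixels: 90 belong to the majority class, 10 to the minority.
y_true = ["water"] * 90 + ["land"] * 10
y_pred = ["water"] * 100          # naive classifier: always predict majority

overall = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)  # 0.90

# Balanced accuracy: mean of per-class recalls.
recall_water = 90 / 90            # 1.0
recall_land = 0 / 10              # 0.0
balanced = (recall_water + recall_land) / 2                          # 0.50

# Cohen's kappa: chance agreement p_e = 0.9 * 1.0 + 0.1 * 0.0 = 0.9,
# so agreement beyond chance is zero.
kappa = (overall - 0.9) / (1 - 0.9)                                  # 0.0
```

The 90% overall accuracy that looks publishable in an abstract collapses to a balanced accuracy of 0.5 and a kappa of 0, which is exactly what reviewers check for.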
The fourth mistake is the opposite error: submitting to Remote Sensing when the paper genuinely belongs at RSE or IEEE TGRS. If your paper presents a real methodological advance, try the higher venue first. The timeline is longer, but the citation impact and career return are substantially greater.
What to use instead of a guessed percentage
Check the MDPI journal statistics page for Remote Sensing to verify current decision times and publication volume.
Scan recent publications in your sub-area. If you see papers with similar scope, sensor data, and methodological depth published regularly, your submission is within range. If everything in your niche goes to IEEE TGRS, RSE, or JAG, that tells you where your community reads and evaluates work.
You can also check the journal's special issue listings. If a well-curated special issue in your exact topic area is currently accepting submissions, that can be a good entry point, provided the guest editor has a strong track record.
Practical verdict
Remote Sensing is a legitimate Q1 Earth Sciences journal with fast turnaround and a moderate selectivity bar. The acceptance rate of ~45-55% reflects a model that publishes solid, technically sound work without demanding field-defining novelty.
For researchers who need fast, indexed, open-access publication of earth observation work, it is one of the strongest options available. For researchers whose work represents a genuine methodological advance, try RSE, IEEE TGRS, or ISPRS JPRS first.
If your remote sensing research includes proper validation against benchmarks, honest accuracy reporting, and a clear contribution statement, prepare your manuscript and submit. A pre-submission manuscript check can flag missing benchmarks, validation gaps, and formatting issues before your paper enters the queue.
Sources
- MDPI, Remote Sensing journal statistics (decision times, ~6,000 articles/year)
- Clarivate Analytics, Journal Citation Reports 2024 (JIF 4.1, Q1 Geosciences)
- SCImago Journal & Country Rank, Remote Sensing
- Scopus CiteScore, Remote Sensing (CiteScore ~8.0)
- MDPI APC information (~2,600 CHF)
- LetPub and SciRev community-reported review and acceptance data
Reference library
Use the core publishing datasets alongside this guide
This article answers one part of the publishing decision. The reference library covers the recurring questions that usually come next: how selective journals are, how long review takes, and what the submission requirements look like across journals.
- Peer Review Timelines by Journal (dataset / reference guide): Reference-grade journal timeline data that authors, labs, and writing centers can cite when discussing realistic review timing.
- Biomedical Journal Acceptance Rates (dataset / benchmark): A field-organized acceptance-rate guide that works as a neutral benchmark when authors are deciding how selective to target.
- Journal Submission Specs (reference table): A high-utility submission table covering word limits, figure caps, reference limits, and formatting expectations.