Is Your Paper Ready for Remote Sensing (MDPI)? An Honest Pre-Submission Checklist
Remote Sensing (MDPI) accepts 40-45% of submissions with an IF of ~4.2 and a $2,700 APC. This guide covers scope, review speed, and how it compares with RSE and IEEE TGRS.
Remote Sensing sits in a peculiar spot in the earth observation journal landscape. It's not the most prestigious remote sensing journal; that title belongs to Remote Sensing of Environment. It doesn't carry the engineering credibility of IEEE Transactions on Geoscience and Remote Sensing. But it has become one of the most widely published and frequently cited journals in the field, processing over 5,000 papers a year with review turnarounds that most competitors can't match. If you're weighing whether to submit here, the decision isn't really about quality thresholds. It's about understanding what this journal is and isn't, and whether that fits your manuscript and your career strategy.
Remote Sensing by the numbers
Remote Sensing (MDPI) publishes roughly 5,000+ papers annually with an acceptance rate of 40-45%, an Impact Factor around 4.2, and a typical first decision within 2-4 weeks. It's fully open access with an APC of approximately $2,700.
| Metric | Value |
|---|---|
| Impact Factor (2024 JCR) | ~4.2 |
| CiteScore | ~8.3 |
| Acceptance rate | ~40-45% |
| Annual publications | 5,000+ |
| Time to first decision | 2-4 weeks |
| Total time to publication | 6-10 weeks |
| Article processing charge | ~$2,700 |
| Open access model | Fully OA (gold) |
| Peer review type | Single-blind |
| Indexed in | Web of Science, Scopus |
| Publisher | MDPI (Basel, Switzerland) |
Those numbers tell a story. The IF of 4.2 is respectable but won't turn heads on a tenure application in a top-20 geography department. The 40-45% acceptance rate is generous compared to the major competitors. And 5,000+ papers per year is a volume that no traditional society journal comes close to matching.
The MDPI question: let's address it directly
You can't write honestly about Remote Sensing without talking about MDPI. The publisher has been the subject of ongoing debate in academic circles, and it's fair to say opinions are split.
Here's what's true: MDPI journals have been accused of prioritizing volume over selectivity. The invitation-to-submit emails that many researchers receive feel spammy. Some special issues have drawn criticism for accepting papers that wouldn't have passed review at more selective venues. MDPI was briefly placed on Beall's list of predatory publishers in 2014, though it was later removed, and most bibliometric experts don't consider it predatory today.
Here's what's also true: Remote Sensing is indexed in Web of Science and Scopus. It has a legitimate Impact Factor tracked by Clarivate. Papers published here get cited and used by other researchers. Many strong research groups publish here regularly, and the journal has built genuine topical authority in areas like agricultural remote sensing, urban monitoring, and SAR applications.
My honest take: Remote Sensing isn't a predatory journal, but it's not above criticism either. The editorial bar varies across special issues and handling editors. You should treat this as a mid-tier journal that's good for solid applied work, not as a home for your most career-defining study, and not as a journal to avoid out of principle.
What Remote Sensing actually publishes
The scope is intentionally broad: anything involving the acquisition, processing, or application of remotely sensed data. That includes satellite imagery, aerial surveys, drone-based sensing, LiDAR, radar (SAR and InSAR), hyperspectral imaging, thermal sensing, and atmospheric remote sensing.
In practice, the papers that do well here tend to fall into a few categories:
Applied classification and mapping studies. Land use/land cover mapping using Sentinel-2, crop type classification, urban expansion monitoring. These make up a substantial fraction of the journal's output. If you've developed a workflow that applies machine learning to satellite imagery for a specific application, this is a natural fit.
Sensor calibration and validation. Cross-sensor comparisons, atmospheric correction algorithm assessments, and ground truth validation campaigns. These papers are technically necessary for the field but often too specialized for high-impact journals.
Method development with regional applications. You've built a new change detection algorithm and tested it over three study sites. The method is solid but it's not going to redefine the field. Remote Sensing will be interested; Remote Sensing of Environment probably won't be.
Review and tutorial papers. The journal publishes a high volume of reviews, often through special issues. If you've been invited to contribute to a special issue, that's a common entry point.
The special issue ecosystem
This is something that sets MDPI apart from traditional publishers, and it's worth understanding before you submit. A large portion of Remote Sensing's output comes through special issues, themed collections organized by guest editors who recruit papers on specific topics.
Special issues aren't inherently bad. They can bring together related work into a coherent collection. But the quality control varies. Some guest editors run tight ships with standards comparable to regular submissions. Others are less discriminating, and the invitation-to-submit model can attract papers that wouldn't have been submitted otherwise.
If you've been invited to submit to a special issue, check the guest editors' track records. Are they established researchers in the topic area? Does the special issue have a clear, specific scope, or is it so broad that almost anything fits? A special issue titled "Advances in Remote Sensing of Vegetation" could mean almost anything, while "SAR-Based Soil Moisture Retrieval Under Dense Canopy Cover" suggests the editors have a clear vision.
Regular submissions (not through special issues) go through the journal's standard editorial workflow and tend to face more consistent review standards.
How Remote Sensing compares to the competition
This is where the strategic thinking matters most. Here's how the landscape breaks down:
| Factor | Remote Sensing (MDPI) | Remote Sensing of Environment | ISPRS Journal | IEEE TGRS | Int. J. Applied Earth Observation |
|---|---|---|---|---|---|
| Impact Factor (2024) | ~4.2 | ~13.5 | ~12.7 | ~7.5 | ~7.6 |
| Acceptance rate | ~40-45% | ~15-20% | ~20-25% | ~25-30% | ~25-30% |
| Review time | 2-4 weeks | 3-6 months | 2-5 months | 3-6 months | 2-4 months |
| APC / model | ~$2,700 (full OA) | ~$4,000 (hybrid) | ~$3,500 (hybrid) | Subscription | ~$3,400 (hybrid) |
| Annual output | 5,000+ | ~1,200 | ~800 | ~1,500 | ~600 |
| Sweet spot | Applied studies, regional analysis | Large-scale environmental science | Photogrammetry and spatial methods | Signal processing and sensor engineering | Geoinformation applications |
A few things stand out from this comparison.
Remote Sensing of Environment vs. Remote Sensing (MDPI). Despite the similar names, these journals aren't in the same tier. RSE (IF ~13.5) is the top journal in the field. It wants papers with global implications, novel environmental insights, and methodological contributions that move the discipline forward. If your paper uses Sentinel-2 to map land cover in one province of one country, RSE won't be interested regardless of how good your classification accuracy is. That paper might work well at MDPI's Remote Sensing.
IEEE TGRS vs. Remote Sensing (MDPI). IEEE TGRS (IF ~7.5) is the default venue for signal processing and algorithm-focused remote sensing work. It's more technically rigorous and more selective, but it also takes 3-6 months for a first decision. If your paper is primarily about a new algorithm and you can wait for the review cycle, TGRS is the stronger choice. If it's more applied and you need a faster turnaround, Remote Sensing makes sense.
ISPRS Journal vs. Remote Sensing (MDPI). ISPRS carries strong authority in photogrammetry, point cloud processing, and spatial data methods. It's significantly more selective and slower, but a publication there carries more weight with specialists.
The review process: what to expect
MDPI's review model is built for speed. Here's the typical timeline:
- Editorial check: 1-3 days (format and scope screening)
- Peer review: 2-4 weeks (usually 2-3 reviewers)
- First decision: 2-4 weeks from submission
- Revision period: 5-10 days (MDPI sets tight revision deadlines)
- Second review: 1-2 weeks
- Production to publication: 1-2 weeks
- Total time: 6-10 weeks from submission to online publication
That speed is a double-edged sword. On one hand, you won't wait six months wondering whether your paper has been read. On the other hand, the tight timelines mean reviewers don't always have time for deep engagement. You'll often receive brief reports, a paragraph or two per reviewer rather than the multi-page critiques you'd see at RSE or IEEE TGRS.
The revision deadlines deserve special attention. MDPI typically gives you 5-10 days to revise, which is dramatically shorter than the 30-60 days most traditional journals allow. If your reviewers request additional experiments or significant reanalysis, that timeline can be stressful. You can request an extension, but don't assume it'll be granted automatically.
Specific failure modes at this journal
Not every rejection at Remote Sensing is about quality. Some are about fit and formatting.
The "Google Earth Engine script" paper. You've written a GEE script that classifies land cover over your study area, achieved 89% accuracy, and written it up. This is the most common type of submission Remote Sensing receives, and it's also the type most likely to be rejected for insufficient novelty. If your paper doesn't introduce a methodological contribution beyond applying existing classifiers to a new area, even Remote Sensing's editors will push back.
Accuracy metrics without uncertainty. Reporting overall accuracy and kappa without confidence intervals, per-class performance metrics, or spatial autocorrelation analysis. Reviewers at this journal have become more demanding about statistical rigor in accuracy assessment.
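To make that concrete, here's a minimal sketch of an uncertainty-aware accuracy report. The confusion matrix counts below are invented for illustration; substitute your own held-out validation results.

```python
import numpy as np

# Hypothetical 3-class confusion matrix (rows = reference, columns = predicted)
# from a held-out validation set. Replace with your own counts.
cm = np.array([
    [412,  23,  15],
    [ 31, 358,  26],
    [ 12,  19, 204],
])

n = cm.sum()
oa = np.trace(cm) / n  # overall accuracy

# 95% confidence interval via the normal approximation to the binomial;
# for small validation samples, prefer a Wilson or bootstrap interval.
se = np.sqrt(oa * (1 - oa) / n)
lo, hi = oa - 1.96 * se, oa + 1.96 * se

# Per-class producer's accuracy (recall) and user's accuracy (precision).
producers = np.diag(cm) / cm.sum(axis=1)
users = np.diag(cm) / cm.sum(axis=0)

print(f"Overall accuracy: {oa:.3f} (95% CI {lo:.3f}-{hi:.3f})")
for k, (pa, ua) in enumerate(zip(producers, users)):
    print(f"Class {k}: producer's accuracy {pa:.3f}, user's accuracy {ua:.3f}")
```

Even this much, an interval around overall accuracy plus per-class figures, addresses the most common reviewer complaint; a spatial autocorrelation check on your validation samples goes further.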
The special issue mismatch. Submitting to a special issue because you received an invitation email, even though your paper only tangentially fits the topic. Guest editors may accept it, but regular reviewers will flag the scope mismatch.
Missing data availability statement. MDPI requires a data availability statement and encourages data deposition. Papers that claim "data available on request" without justification increasingly face pushback from reviewers.
Honest self-assessment before submitting
Work through these questions:
Is your method genuinely new, or are you applying known methods to a new area? Both can work at Remote Sensing, but the framing needs to be different. A purely regional application needs strong local significance and thorough validation. A methodological paper needs to demonstrate something that didn't exist before.
Can your study area and results interest readers outside your region? A paper about flood mapping in Bangladesh using Sentinel-1 can interest readers worldwide if you frame it around transferable methods. The same study framed purely as "flood mapping in District X" won't attract readership.
Have you compared your approach against standard baselines? Don't just report your method's accuracy. Show how it performs against random forests, SVMs, U-Net, or whatever the accepted baseline is for your task. Papers without comparison experiments are the easiest to reject.
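Here's a minimal sketch of what a fair baseline comparison looks like, using scikit-learn. The feature matrix, labels, and model settings are placeholders, not recommendations; swap in your real training samples and your own method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Hypothetical per-pixel feature matrix (e.g., Sentinel-2 band reflectances)
# and class labels. Substitute your own training data.
rng = np.random.default_rng(0)
X = rng.random((500, 10))
y = rng.integers(0, 4, 500)

# Identical folds for every model so the comparison is apples-to-apples.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)

baselines = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "svm_rbf": SVC(kernel="rbf", C=10, gamma="scale"),
}

for name, model in baselines.items():
    scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

The detail that matters is the shared cross-validation splitter: evaluating your method on one split and the baselines on another is a fast route to rejection.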
Is your imagery and preprocessing chain fully documented? Atmospheric correction method, cloud masking, temporal compositing: reviewers expect each of these to be specified precisely, not glossed over with "standard preprocessing was applied."
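One lightweight way to guarantee this is to record the chain as structured metadata and publish it alongside the paper. A sketch with illustrative values; the tool names and parameters below are examples, not endorsements:

```python
import json

# Record every preprocessing step as structured, citable metadata.
# All values here are placeholders for your actual processing chain.
preprocessing = {
    "sensor": "Sentinel-2 MSI, Level-2A",
    "atmospheric_correction": "Sen2Cor 2.11",
    "cloud_masking": "s2cloudless, probability threshold 0.4",
    "temporal_compositing": "median, 2021-04-01 to 2021-09-30",
    "resampling": "bilinear to 10 m",
    "crs": "EPSG:32633",
}

# Ship this file with your code or supplementary material.
with open("preprocessing_chain.json", "w") as f:
    json.dump(preprocessing, f, indent=2)
```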
Are you comfortable with the APC? $2,700 isn't trivial, especially for researchers in countries where that exceeds a monthly salary. Factor this into your decision. If your institution doesn't cover APCs, the hybrid options at RSE or IEEE TGRS might work out cheaper, especially if you can use the green OA route (preprint archiving).
Making the most of the submission
If you've decided Remote Sensing is the right target, a few practical tips will help.
Choose regular submission over special issue invitations you didn't seek out. Unless the special issue is in your exact area and the guest editors are people you'd cite, the standard submission route tends to produce more consistent reviews.
Write a clear, direct abstract. MDPI papers are fully open access, which means your abstract is doing double duty as a discovery mechanism. Include your study area, sensor data, method, key quantitative results, and the takeaway. Don't waste words on generic background sentences like "Remote sensing has been widely applied to monitor environmental change."
Prepare your figures at production quality from the start. MDPI's production team moves fast, and you won't have much time to fix figures between acceptance and publication. Make sure your maps have proper scale bars, north arrows, coordinate reference labels, and colorbar legends that are readable at print size.
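If you build figures in matplotlib, setting print-quality defaults up front is cheaper than fixing them at proof stage. A minimal sketch; the figure size, font sizes, and DPI below are illustrative assumptions, not MDPI's official spec, so check the journal's figure guidelines:

```python
import matplotlib.pyplot as plt
import numpy as np

# Print-oriented defaults: small fonts stay legible at single-column width,
# and 300 dpi is a common minimum for raster figures.
plt.rcParams.update({
    "font.size": 9,
    "axes.labelsize": 9,
    "figure.dpi": 300,
    "savefig.dpi": 300,
})

fig, ax = plt.subplots(figsize=(3.5, 2.8))  # roughly single-column width
data = np.random.default_rng(1).random((50, 50))  # stand-in for a real map
im = ax.imshow(data, cmap="viridis")
fig.colorbar(im, ax=ax, label="NDVI (unitless)")  # labeled colorbar legend
ax.set_xlabel("Easting (px)")
ax.set_ylabel("Northing (px)")
# Scale bars and north arrows typically come from GIS tooling or extra
# plotting code; they are not added automatically.
fig.savefig("figure1.png", bbox_inches="tight")
```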
Use the MDPI LaTeX template or Word template exactly as provided. Format violations during the initial check slow things down and can create a negative first impression.
Before submitting, consider running your manuscript through an AI-powered pre-submission review to check that your framing, methods description, and results presentation align with what this journal's editors and reviewers expect.
When to aim higher
Here's the honest calculation: if your paper introduces a method that works across multiple sensors and study areas, or if it reveals something genuinely new about an environmental process through remote sensing data, it doesn't belong at Remote Sensing (MDPI). Send it to RSE, ISPRS, or IEEE TGRS. The longer review time is worth it for the prestige and the more rigorous feedback you'll receive.
Remote Sensing (MDPI) is the right call when your work is technically solid, contributes useful knowledge to the field, but isn't going to redefine how people think about remote sensing. That's not an insult; it's the reality of where most research falls, and there's nothing wrong with publishing it in a journal that'll get it out quickly and make it freely available to everyone.
Bottom line
Remote Sensing (MDPI) offers speed, open access, and a relatively accessible editorial bar. It won't carry the prestige of RSE or IEEE TGRS, and the MDPI brand still raises eyebrows in some corners of academia. But for applied remote sensing work that needs timely publication and broad visibility, it's a legitimate option that thousands of researchers use every year. Just go in with clear expectations about what publishing here does and doesn't signal about your work.
Sources
- Remote Sensing journal homepage and author guidelines, MDPI: https://www.mdpi.com/journal/remotesensing
- Clarivate Journal Citation Reports (2024 edition): https://jcr.clarivate.com/
- Scopus Source Record for Remote Sensing: https://www.scopus.com/sourceid/86430
- MDPI article processing charges: https://www.mdpi.com/apc
- IEEE Transactions on Geoscience and Remote Sensing author guidelines: https://ieeexplore.ieee.org/xpl/RecentIssue.jsp?punumber=36
- Remote Sensing of Environment journal homepage, Elsevier: https://www.sciencedirect.com/journal/remote-sensing-of-environment