Import & Export
Move data in and out of MongoDB efficiently with support for multiple formats, large datasets, and intelligent data transformations. Agent M handles the complexity while you focus on your data.
Export Formats
Choose the right export format based on your needs. Each format is optimized for different use cases, from data analysis to backup and sharing.
JSON
Native MongoDB format preserving all data types and structures
Best for: backup, data migration, and preserving complex documents
Excel
Formatted spreadsheets with multiple sheets for related collections
Best for: sharing data with non-technical stakeholders
CSV
Comma-separated values for simple data analysis and reporting
Best for: data analysis tools and simple integrations
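To make the format trade-off concrete, here is a minimal sketch (not Agent M's internal code; the sample documents and field names are hypothetical) of the same documents exported two ways: JSON keeps nesting intact, while CSV needs each document flattened into one row.

```python
import csv
import io
import json

# Hypothetical sample documents; the fields are illustrative only.
docs = [
    {"_id": 1, "name": "Ada", "address": {"city": "London", "zip": "EC1"}},
    {"_id": 2, "name": "Lin", "address": {"city": "Oslo", "zip": "0150"}},
]

# JSON keeps the nested structure intact -- good for backup and migration.
json_export = json.dumps(docs, indent=2)

def flatten(doc, prefix=""):
    """Flatten nested documents into dotted column names for CSV."""
    flat = {}
    for key, value in doc.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, name + "."))
        else:
            flat[name] = value
    return flat

# CSV flattens each document into one row -- good for analysis tools.
rows = [flatten(d) for d in docs]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
writer.writeheader()
writer.writerows(rows)
csv_export = buf.getvalue()
```

The CSV header becomes `_id,name,address.city,address.zip`, which external tools handle easily, but round-tripping it back into nested documents requires reversing the flattening; JSON avoids that step.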
Import Sources
Agent M supports importing data from various sources with intelligent parsing and validation. The AI assistant helps map fields and suggests optimal data structures.
Spreadsheet Files
Excel, CSV, and TSV files with intelligent column mapping and data validation
JSON Data
MongoDB exports, API responses, and structured JSON documents
Database Dumps
MongoDB dumps, SQL exports, and other database backup formats
Compressed Archives
ZIP and TAR archives containing multiple data files
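The column mapping and validation described above can be sketched roughly as follows. This is an illustrative stand-in, not Agent M's actual mapper: the `COLUMN_MAP` headers, field names, and the sample rows are all invented for the example.

```python
import csv
import io

# Hypothetical mapping: spreadsheet header -> (document field, type).
COLUMN_MAP = {
    "Full Name": ("name", str),
    "Age": ("age", int),
    "Signup Date": ("signed_up", str),
}

def import_rows(csv_text, column_map):
    """Map CSV columns to document fields, coercing types and
    collecting rows that fail validation instead of aborting."""
    docs, errors = [], []
    # Data starts on line 2 of the file (line 1 is the header).
    for line_no, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=2):
        doc = {}
        try:
            for column, (field, cast) in column_map.items():
                doc[field] = cast(row[column])
        except (KeyError, ValueError) as exc:
            errors.append((line_no, str(exc)))
            continue
        docs.append(doc)
    return docs, errors

sample = (
    "Full Name,Age,Signup Date\n"
    "Ada,36,2024-01-05\n"
    "Lin,not-a-number,2024-02-11\n"
)
docs, errors = import_rows(sample, COLUMN_MAP)
# One valid document; the row with a non-numeric Age is reported, not imported.
```

Collecting per-row errors rather than failing fast is what lets an assistant surface a validation report before anything is written to the database.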
Advanced Features
Beyond basic import and export, Agent M offers advanced data movement capabilities for enterprise use cases and complex data workflows.
Bidirectional Sync
Two-way data synchronization with external systems and databases
Custom Transformations
Apply data transformations during import/export with custom rules
Secure Processing
Encrypted data handling with secure temporary file management
Scheduled Operations
Automate regular imports and exports with flexible scheduling
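Custom transformation rules of the kind mentioned above can be modeled as a pipeline of functions applied to each document as it streams through. A minimal sketch, assuming hypothetical rules (the redaction and tagging logic here is invented for illustration):

```python
# Hypothetical rules: each is a function from document to document.
def redact_email(doc):
    """Mask the email field if present (e.g. for a sanitized export)."""
    if "email" in doc:
        doc = {**doc, "email": "***"}
    return doc

def add_source_tag(doc):
    """Tag imported documents with their origin (tag value is illustrative)."""
    return {**doc, "source": "legacy-import"}

RULES = [redact_email, add_source_tag]

def transform(docs, rules):
    """Lazily apply each rule in order to every document."""
    for doc in docs:
        for rule in rules:
            doc = rule(doc)
        yield doc

incoming = [{"name": "Ada", "email": "ada@example.com"}]
out = list(transform(incoming, RULES))
```

Because `transform` is a generator, rules compose with streaming import/export without materializing the whole dataset in memory.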
Large Dataset Handling
Optimized for Scale
Performance Features
- Streaming processing for memory efficiency
- Parallel processing for faster operations
- Progress tracking with ETA calculations
- Resume capability for interrupted operations
Scale Limits
- Files up to 10GB in size
- Millions of documents per operation
- Batch processing with configurable sizes
- Multi-collection operations
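The core idea behind configurable batch sizes is simple to sketch: consume a cursor in fixed-size chunks so memory use stays flat regardless of how many documents the operation touches. This is a generic illustration, not Agent M's implementation; the document shape is hypothetical.

```python
from itertools import islice

def batched(iterable, batch_size):
    """Yield lists of at most batch_size items -- keeps memory flat
    even when the source holds millions of documents."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Simulate a large cursor with a generator (hypothetical documents).
cursor = ({"_id": i} for i in range(10))
batch_sizes = [len(b) for b in batched(cursor, 4)]
# batch_sizes == [4, 4, 2]
```

Each batch can then be written, retried, or checkpointed independently, which is also what makes resuming an interrupted operation practical.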
Best Practices
Export Best Practices
- Choose JSON format for complete data fidelity and backup purposes
- Use CSV for data analysis and sharing with external tools
- Export large datasets in batches to manage memory usage
- Include metadata and schema information for complex exports
Import Best Practices
- Always validate data quality before importing large datasets
- Test imports with a small sample first to verify mapping
- Use staging collections for large imports before moving to production
- Monitor import progress and have rollback plans for critical data
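The "validate first, test with a sample" advice can be captured in a small pre-flight check. A minimal sketch under invented assumptions (the `sku`/`price` fields and sample size are hypothetical):

```python
def validate_sample(docs, required_fields, sample_size=100):
    """Check the first sample_size documents for missing required
    fields before committing to a full import."""
    problems = []
    for index, doc in enumerate(docs[:sample_size]):
        missing = [f for f in required_fields if f not in doc]
        if missing:
            problems.append((index, missing))
    return problems

batch = [
    {"sku": "A-1", "price": 9.99},
    {"sku": "A-2"},               # missing price -- caught before import
]
problems = validate_sample(batch, ["sku", "price"])
# problems == [(1, ["price"])]
```

Running a check like this against a staging collection first means mapping mistakes surface on a hundred documents instead of a million.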