Handling Large Files
Performance optimizations for importing large datasets
ImportCSV automatically handles large files with optimizations that require no configuration.
Automatic Optimizations
Virtual Scrolling
Only ~20 rows are rendered in the DOM at any time, regardless of file size. Users can scroll through thousands of rows smoothly at 60fps.
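Conceptually, windowed rendering boils down to computing which slice of rows intersects the viewport and mounting only those. The sketch below is illustrative, not ImportCSV's internal code; the function and parameter names are assumptions.

```typescript
// Conceptual sketch of windowed rendering: only the rows that intersect the
// viewport (plus a small overscan buffer) are mounted in the DOM.
interface RowWindow {
  start: number; // index of the first rendered row
  end: number;   // index one past the last rendered row
}

function visibleWindow(
  scrollTop: number,      // current scroll offset in px
  rowHeight: number,      // fixed row height in px
  viewportHeight: number, // height of the scroll container in px
  totalRows: number,
  overscan = 5            // extra rows above/below to avoid flicker
): RowWindow {
  const first = Math.floor(scrollTop / rowHeight);
  const visible = Math.ceil(viewportHeight / rowHeight);
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(totalRows, first + visible + overscan),
  };
}

// On each scroll event, re-render only rows[start..end); the full dataset
// stays in a plain array in memory.
```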
Progressive Validation
- First 50 rows: Validated instantly on upload (< 100ms)
- Remaining rows: Validated in background chunks of 100 rows
- Progress indicator: Shows validation progress for large files
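The background pass works roughly like the sketch below: a simplified illustration using Zod's `safeParse`, where the function and callback names are assumptions rather than ImportCSV's actual API.

```typescript
import { z } from 'zod';

// Rough sketch of chunked validation (not ImportCSV's internals): the first
// 50 rows are validated synchronously for instant feedback, the rest in
// background chunks of 100, yielding to the event loop between chunks.
type RowResult = { index: number; success: boolean; error?: z.ZodError };

async function validateProgressively(
  rows: unknown[],
  schema: z.ZodTypeAny,
  onProgress: (validated: number, total: number) => void
): Promise<RowResult[]> {
  const FIRST_PASS = 50;
  const CHUNK = 100;
  const results: RowResult[] = [];

  const validate = (start: number, end: number) => {
    for (let i = start; i < Math.min(end, rows.length); i++) {
      const parsed = schema.safeParse(rows[i]);
      results.push(
        parsed.success
          ? { index: i, success: true }
          : { index: i, success: false, error: parsed.error }
      );
    }
    onProgress(results.length, rows.length);
  };

  // First 50 rows: validated immediately.
  validate(0, FIRST_PASS);

  // Remaining rows: validated in chunks, yielding to keep the UI responsive.
  for (let i = FIRST_PASS; i < rows.length; i += CHUNK) {
    await new Promise((resolve) => setTimeout(resolve, 0));
    validate(i, i + CHUNK);
  }

  return results;
}
```

Yielding between chunks keeps the main thread free for scrolling and input while validation continues in the background.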
Memory Efficiency
Parsed data is stored in memory-efficient structures, and DOM memory usage stays constant regardless of file size because only the rendered rows are kept in the DOM.
What to Expect
| File Size | Experience |
|---|---|
| Under 1,000 rows | Instant - validation completes immediately |
| 1,000 - 10,000 rows | Fast - brief validation progress shown |
| 10,000+ rows | Works well - may see loading states |
| 50,000+ rows | Consider chunking on your server |
Tips for Best Performance
1. Use Built-in Validators
```typescript
// Slower - complex regex
email: z.string().regex(/^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$/)

// Faster - use built-in validator
email: z.string().email()
```

2. Keep Transforms Simple
```typescript
// Only apply necessary transformations
const schema = z.object({
  email: z.string()
    .email()
    .transform(s => s.trim().toLowerCase()) // Essential
  // Don't add unnecessary transformations
});
```

3. Process Results in Batches
```typescript
const handleComplete = async (data: Transaction[]) => {
  const BATCH_SIZE = 1000;

  for (let i = 0; i < data.length; i += BATCH_SIZE) {
    const batch = data.slice(i, i + BATCH_SIZE);
    await sendToServer(batch);
  }
};
```

Server-Side Chunking
For very large files (50k+ rows), process in chunks before importing:
```typescript
// Example: Split large file before import
async function handleLargeFile(file: File) {
  const CHUNK_SIZE = 10000;
  const text = await file.text();
  // Note: naive line splitting; CSVs with quoted newlines need a real CSV parser
  const lines = text.split('\n');
  const header = lines[0];

  for (let i = 1; i < lines.length; i += CHUNK_SIZE) {
    const chunk = [header, ...lines.slice(i, i + CHUNK_SIZE)].join('\n');
    await processChunk(chunk);
  }
}
```

Troubleshooting
File takes too long to validate
- Simplify validators (use built-in types like `.email()`, `.url()`)
- Reduce the number of required fields
- Consider server-side validation for 50k+ rows
Browser becomes unresponsive
- Check file size (should be < 50MB for browser processing)
- Close other tabs to free memory
- Try Chrome for best performance
Memory issues
- Process data in batches after import
- Consider pagination for displaying results
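For display, rendering one page of imported rows at a time is usually enough. A minimal client-side sketch (the helper name and page size are illustrative):

```typescript
// Return a single page of imported rows instead of rendering the full set.
function getPage<T>(rows: T[], page: number, pageSize = 100): T[] {
  const start = page * pageSize;
  return rows.slice(start, start + pageSize);
}

// e.g. render getPage(importedRows, 0) first, then swap in later pages as
// the user navigates.
```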