# Manual Log Upload
Upload log files directly through the Chipper web interface. Perfect for one-time uploads, testing, or when you don't have cloud storage configured.
## When to Use Manual Upload
| Use Manual Upload If... | Use Cloud Automation If... |
|---|---|
| ✅ One-time file uploads | ❌ Continuous log ingestion |
| ✅ Testing/proof-of-concept | ❌ Production workloads |
| ✅ Small batches (< 100 files) | ❌ High volume (hundreds/thousands) |
| ✅ Files stored locally | ❌ Files already in S3/Azure |
| ✅ No cloud infrastructure | ❌ Using AWS/Azure |
Tip: Manual upload is great for getting started, but consider AWS or Azure automation for production use.
## How It Works
```mermaid
graph LR
    A[Your Computer] -->|Upload via Web UI| B[Chipper]
    B -->|Processes| C[Pipeline]
    C -->|Stores| D[Dashboards]
```
- File Storage: Files are temporarily stored in Chipper, processed, then discarded
- Retention: Parsed log data is retained; original files are not
- Limits: 100 MB per individual file
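A quick pre-flight check can save a failed upload. For example, on a Unix-like shell (GNU or BSD `find`), list any local files over the 100 MB limit before you start:

```bash
# Flag files in the current directory that exceed the 100 MB per-file limit
find . -maxdepth 1 -type f -size +100M -exec ls -lh {} +
```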
## Upload Methods
### Method 1: Drag and Drop

1. Navigate to Upload Page
   - Go to Log Sources → Manual Upload
2. Drag Files
   - Drag one or more log files from your computer
   - Drop them in the upload area
3. Processing Begins
   - Files are uploaded and processed immediately
   - Progress bar shows status
### Method 2: File Picker

1. Navigate to Upload Page
   - Go to Log Sources → Manual Upload
2. Click Upload
   - Click the "Select Files" or "Browse" button
   - Choose one or more files from your computer
3. Upload
   - Click "Upload" to begin processing
## Supported File Formats

### Plain Text Logs

- `.log` - Standard log files
- `.txt` - Text files
- Raw/no extension - Plain text

### Structured Formats

- `.json` - JSON logs (line-delimited or array)
- `.csv` - Comma-separated values
- `.tsv` - Tab-separated values

### Compressed Formats

- `.gz` - Gzip compressed
- `.zip` - ZIP archives (all files extracted)
- `.bz2` - Bzip2 compressed
- `.tar.gz` - Tarball archives

Note: Compressed files are automatically extracted. All extracted files must be supported formats.
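If your JSON logs are wrapped in a single top-level array and you'd rather upload line-delimited JSON, a one-liner with `jq` (assuming you have it installed) converts between the two. The filenames here are placeholders:

```bash
# Unwrap a JSON array of log entries into line-delimited JSON (NDJSON)
jq -c '.[]' logs-array.json > logs.ndjson
```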
## File Size & Limits
| Limit | Value |
|---|---|
| Max file size | 100 MB per file |
| Max files per upload | 50 files |
| Max total upload size | 500 MB per batch |
For larger files:
- Split into smaller chunks
- Use compression (e.g., gzip)
- Consider automated cloud ingestion (AWS / Azure)
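As a sketch of the split-and-compress approach, using standard `split` and `gzip` (filenames are illustrative):

```bash
# Break a large log into ~90 MB chunks, then gzip each chunk
split -b 90M huge.log huge.log.part-
gzip huge.log.part-*
# Result: huge.log.part-aa.gz, huge.log.part-ab.gz, ...
```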
## Upload Interface

When you navigate to Manual Upload, you'll see:

### Upload Area
```
┌─────────────────────────────────────────┐
│                                         │
│         📁 Drag files here              │
│         or click to browse              │
│                                         │
│   Supported: .log, .json, .csv, .gz     │
│   Max size: 100 MB per file             │
│                                         │
└─────────────────────────────────────────┘
```
### File List

After selecting files, you'll see:

| File Name | Size | Status |
|---|---|---|
| app.log | 2.3 MB | ⏳ Uploading (45%) |
| error.log | 1.8 MB | ✅ Processed |
| debug.log | 5.1 MB | ⏳ Queued |
### Actions
- Remove: Remove file from upload queue (before processing)
- Cancel All: Cancel entire batch
- Retry: Retry failed uploads
## Processing Status

### Status Indicators
| Status | Icon | Meaning |
|---|---|---|
| Queued | ⏳ | Waiting to upload |
| Uploading | ↗️ | Transferring file |
| Processing | ⚙️ | Parsing log entries |
| Complete | ✅ | Successfully processed |
| Failed | ❌ | Error occurred |
### View Processed Logs
After upload completes:
- Go to Dashboards
- Uploaded logs appear immediately
- Use filters to find specific entries
## Common Issues & Solutions

### File Upload Failed
Symptom: ❌ Error during upload
Common Causes:
- File size exceeds 100 MB
- Network connection interrupted
- Unsupported file format
Solutions:
- Compress large files (gzip recommended)
- Check internet connection
- Verify file format is supported
- Try uploading smaller batches
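One way to avoid size failures is to compress anything over the limit up front. A minimal sketch using GNU `stat` (on macOS, swap in `stat -f%z`); the `.log` glob is illustrative:

```bash
# Gzip any .log file larger than 100 MB (104857600 bytes)
for f in *.log; do
  if [ "$(stat -c%s "$f")" -gt 104857600 ]; then
    gzip "$f"
  fi
done
```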
### File Processed but No Data
Symptom: ✅ Upload succeeded, but no logs in dashboard
Common Causes:
- File is empty
- File format not recognized
- Log entries don't match expected patterns
Solutions:
- Open file locally to verify contents
- Check file format (JSON should be line-delimited or array)
- Try a known-good sample file first
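A few quick local checks, assuming a Unix-like shell and `jq` for the JSON validation (the filename is a placeholder):

```bash
file app.json                           # confirm the file type
head -n 3 app.json                      # eyeball the first few entries
jq empty app.json && echo "valid JSON"  # works for arrays and line-delimited JSON
```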
### Slow Upload
Symptom: Upload takes a long time
Causes:
- Large file size
- Slow internet connection
- Many files in batch
Solutions:
- Compress files before uploading (`.gz`)
- Upload smaller batches
- Use cloud automation for large volumes (AWS / Azure)
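Text logs typically compress well, so gzip is usually the quickest win. A sketch to check the payoff before uploading (gzip's `-k` flag, available in gzip 1.6+, keeps the original file):

```bash
du -h app.log      # size before
gzip -k app.log    # compress, keeping the original
du -h app.log.gz   # size after - often ~10x smaller for text logs
```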
## Best Practices

### Before Upload

- ✅ Verify file formats - Ensure files are supported types
- ✅ Compress large files - Use gzip to reduce size (10x reduction typical)
- ✅ Test with a small sample - Upload one file first to verify parsing (see the sketch below)
- ✅ Remove sensitive data - Redact credentials and PII if needed
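For the small-sample test, something like this works on a Unix-like shell (filenames are illustrative):

```bash
# Take the first 1,000 lines as a parsing test, optionally compressed
head -n 1000 app.log > sample.log
gzip sample.log    # upload sample.log.gz first to verify parsing
```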
### During Upload

- ✅ Monitor progress - Watch for errors in the file list
- ✅ Don't close browser - Keep page open until complete
- ✅ Batch similar files - Group by application or date
### After Upload

- ✅ Verify in dashboard - Check logs appeared correctly
- ✅ Review parsing - Ensure fields extracted properly
- ✅ Delete local files (optional) - Chipper has the data
- ✅ Set up alerts - Create alerts for important patterns
## Example: Upload Workflow

### Scenario: Uploading Application Logs

1. Collect Logs

   ```bash
   # On your server
   cd /var/log/myapp
   tar -czf logs-2024-01-15.tar.gz *.log
   ```

2. Download to Local Machine

   ```bash
   scp server:/var/log/myapp/logs-2024-01-15.tar.gz ~/Desktop/
   ```

3. Upload to Chipper
   - Go to Manual Upload in Chipper
   - Drag `logs-2024-01-15.tar.gz` to the upload area
   - Wait for processing (tar.gz is auto-extracted)

4. View in Dashboard
   - Go to Dashboards
   - Filter by date: `2024-01-15`
   - Analyze logs
## Comparison: Manual vs Automated
| Feature | Manual Upload | AWS/Azure Automated |
|---|---|---|
| Setup Time | ✅ Instant | ⚠️ 10-15 minutes |
| Ongoing Effort | ❌ Manual per upload | ✅ Automatic |
| File Limits | ⚠️ 100 MB per file | ✅ No limit |
| Latency | ⚠️ On-demand only | ✅ Hourly or real-time |
| Best For | Testing, one-offs | Production, continuous |
## Automation Alternatives

### When You Outgrow Manual Upload
Consider automated ingestion when:
- ❌ Uploading files daily or more frequently
- ❌ Files exceed 100 MB regularly
- ❌ Need near-real-time ingestion
- ❌ Managing many applications/services
### Migration Path
- Start: Manual upload for testing
- Grow: Set up AWS Pull or Azure Pull for automated polling
- Scale: Add Push notifications for real-time ingestion
## Security Considerations

### Data Transmission
- ✅ HTTPS encryption - All uploads encrypted in transit
- ✅ Secure storage - Files processed in an isolated environment
- ✅ Automatic deletion - Original files discarded after processing
### Sensitive Data
⚠️ Review before upload:
- Remove credentials (API keys, passwords)
- Redact PII (names, emails, addresses) if required
- Mask sensitive IP addresses if needed
Tip: Use automated tools to sanitize logs before upload:

```bash
# Example: Remove API keys
sed 's/api_key=[^&]*/api_key=REDACTED/g' app.log > app-clean.log
```
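A similar pattern works for other PII. For example, a rough sketch for email addresses (the regex is illustrative - tune it to your data before relying on it):

```bash
# Example: Redact email addresses
sed -E 's/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+/EMAIL-REDACTED/g' app.log > app-clean.log
```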
### File Retention
| Data Type | Retention |
|---|---|
| Original files | Deleted immediately after processing |
| Parsed log entries | Retained per your plan (typically 30-90 days) |
| Metadata | Upload history retained indefinitely |
Note: Only parsed log data is retained - original files are not stored.
## Next Steps
- ✅ Upload your first file - Try the manual upload interface
- ✅ Check dashboards - Verify logs appear correctly
- ✅ Create alerts - Set up alerts for important patterns
- ✅ Consider automation - When ready, set up AWS or Azure for continuous ingestion