
Manual Log Upload

Upload log files directly through the Chipper web interface. Perfect for one-time uploads, testing, or when you don't have cloud storage configured.


When to Use Manual Upload

| Use Manual Upload If... | Use Cloud Automation If... |
| --- | --- |
| ✅ One-time file uploads | ❌ Continuous log ingestion |
| ✅ Testing/proof-of-concept | ❌ Production workloads |
| ✅ Small batches (< 100 files) | ❌ High volume (hundreds/thousands) |
| ✅ Files stored locally | ❌ Files already in S3/Azure |
| ✅ No cloud infrastructure | ❌ Using AWS/Azure |

Tip: Manual upload is great for getting started, but consider AWS or Azure automation for production use.


How It Works

Your Computer → Chipper (upload via the web UI) → Pipeline (processing) → Dashboards (storage)

  • File storage: Files are temporarily stored in Chipper, processed, then discarded
  • Retention: Parsed log data is retained; original files are not
  • Limits: 100 MB per file (see File Size & Limits below)


Upload Methods

Method 1: Drag and Drop

  1. Navigate to Upload Page

    • Go to Log Sources → Manual Upload
  2. Drag Files

    • Drag one or more log files from your computer
    • Drop them in the upload area
  3. Processing Begins

    • Files are uploaded and processed immediately
    • Progress bar shows status

Method 2: File Picker

  1. Navigate to Upload Page

    • Go to Log Sources → Manual Upload
  2. Click Upload

    • Click the "Select Files" or "Browse" button
    • Choose one or more files from your computer
  3. Upload

    • Click "Upload" to begin processing

Supported File Formats

Plain Text Logs

  • .log - Standard log files
  • .txt - Text files
  • Raw/no extension - Plain text

Structured Formats

  • .json - JSON logs (line-delimited or an array; see the conversion example after this list)
  • .csv - Comma-separated values
  • .tsv - Tab-separated values
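
Both JSON shapes are accepted: line-delimited (one object per line) and a single top-level array. If you want to convert an array export into line-delimited JSON before uploading, a tool such as jq can do it. A minimal sketch, assuming jq is installed and events.json contains an array of log objects:

# Convert a JSON array of log entries into line-delimited JSON (one object per line)
jq -c '.[]' events.json > events.ndjson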

Compressed Formats

  • .gz - Gzip compressed
  • .zip - ZIP archives (all files extracted)
  • .bz2 - Bzip2 compressed
  • .tar.gz - Tarball archives

Note: Compressed files are automatically extracted. All extracted files must be supported formats.
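
Because every extracted file must be a supported format, it can help to inspect an archive locally before uploading. A quick check with standard command-line tools (file names here are illustrative):

# List the contents of a tarball to confirm each entry is a supported type
tar -tzf logs-2024-01-15.tar.gz

# Verify a gzip file is intact before uploading
gzip -t app.log.gz && echo "archive OK"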


File Size & Limits

| Limit | Value |
| --- | --- |
| Max file size | 100 MB per file |
| Max files per upload | 50 files |
| Max total upload size | 500 MB per batch |

For larger files (see the example after this list):

  • Split into smaller chunks
  • Use compression (e.g., gzip)
  • Consider automated cloud ingestion (AWS / Azure)
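
For example, you can compress a large log or split it into smaller pieces before uploading to stay under the 100 MB per-file limit. A minimal sketch using standard Unix tools (the chunk size is illustrative; pick one that keeps each piece under the limit):

# Compress while keeping the original (text logs typically shrink ~10x)
gzip -k app.log                      # produces app.log.gz

# Or split a very large file into 100,000-line chunks, then compress each chunk
split -l 100000 app.log app-part-
gzip app-part-*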

Upload Interface

When you navigate to Manual Upload, you'll see:

Upload Area

┌─────────────────────────────────────────┐
│                                         │
│           📁 Drag files here            │
│           or click to browse            │
│                                         │
│    Supported: .log, .json, .csv, .gz    │
│        Max size: 100 MB per file        │
│                                         │
└─────────────────────────────────────────┘

File List

After selecting files, you'll see:

| File Name | Size | Status |
| --- | --- | --- |
| app.log | 2.3 MB | ⏳ Uploading (45%) |
| error.log | 1.8 MB | ✅ Processed |
| debug.log | 5.1 MB | ⏳ Queued |

Actions

  • Remove: Remove file from upload queue (before processing)
  • Cancel All: Cancel entire batch
  • Retry: Retry failed uploads

Processing Status

Status Indicators

| Status | Icon | Meaning |
| --- | --- | --- |
| Queued | ⏳ | Waiting to upload |
| Uploading | ↗️ | Transferring file |
| Processing | ⚙️ | Parsing log entries |
| Complete | ✅ | Successfully processed |
| Failed | ❌ | Error occurred |

View Processed Logs

After upload completes:

  1. Go to Dashboards
  2. Uploaded logs appear immediately
  3. Use filters to find specific entries

Common Issues & Solutions

File Upload Failed

Symptom: ❌ Error during upload

Common Causes:

  • File size exceeds 100 MB
  • Network connection interrupted
  • Unsupported file format

Solutions:

  • Compress large files (gzip recommended)
  • Check internet connection
  • Verify file format is supported
  • Try uploading smaller batches

File Processed but No Data

Symptom: ✅ Upload succeeded, but no logs in dashboard

Common Causes:

  • File is empty
  • File format not recognized
  • Log entries don't match expected patterns

Solutions:

  • Open file locally to verify contents
  • Check file format (JSON should be line-delimited or array)
  • Try a known-good sample file first
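
One way to do that is to generate a tiny, known-good sample locally and upload it on its own; if the sample parses, the problem is likely the original file's format. A sketch (the timestamp/level layout is illustrative; use whatever format your logs normally follow):

# Create a small plain-text sample log to verify ingestion end to end
printf '%s\n' \
  '2024-01-15T10:00:00Z INFO  app started' \
  '2024-01-15T10:00:01Z ERROR connection refused' \
  > sample.log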

Slow Upload

Symptom: Upload takes a long time

Causes:

  • Large file size
  • Slow internet connection
  • Many files in batch

Solutions:

  • Compress files before uploading (.gz)
  • Upload smaller batches
  • Use cloud automation for large volumes (AWS / Azure)

Best Practices

Before Upload

  • Verify file formats - Ensure files are supported types
  • Compress large files - Use gzip to reduce size (a ~10x reduction is typical)
  • Test with a small sample - Upload one file first to verify parsing
  • Remove sensitive data - Redact credentials and PII if needed

During Upload

  • Monitor progress - Watch for errors in the file list
  • Don't close the browser - Keep the page open until uploads complete
  • Batch similar files - Group by application or date

After Upload

  • Verify in dashboard - Check that logs appeared correctly
  • Review parsing - Ensure fields were extracted properly
  • Delete local files (optional) - Chipper retains the parsed data
  • Set up alerts - Create alerts for important patterns


Example: Upload Workflow

Scenario: Uploading Application Logs

  1. Collect Logs

    # On your server
    cd /var/log/myapp
    tar -czf logs-2024-01-15.tar.gz *.log
  2. Download to Local Machine

    scp server:/var/log/myapp/logs-2024-01-15.tar.gz ~/Desktop/
  3. Upload to Chipper

    • Go to Manual Upload in Chipper
    • Drag logs-2024-01-15.tar.gz to the upload area
    • Wait for processing (tar.gz is auto-extracted)
  4. View in Dashboard

    • Go to Dashboards
    • Filter by date: 2024-01-15
    • Analyze logs

Comparison: Manual vs Automated

| Feature | Manual Upload | AWS/Azure Automated |
| --- | --- | --- |
| Setup Time | ✅ Instant | ⚠️ 10-15 minutes |
| Ongoing Effort | ❌ Manual per upload | ✅ Automatic |
| File Limits | ⚠️ 100 MB per file | ✅ No limit |
| Latency | ⚠️ On-demand only | ✅ Hourly or real-time |
| Best For | Testing, one-offs | Production, continuous |

Automation Alternatives

When You Outgrow Manual Upload

Consider automated ingestion when:

  • ❌ Uploading files daily or more frequently
  • ❌ Files exceed 100 MB regularly
  • ❌ Need near-real-time ingestion
  • ❌ Managing many applications/services

Migration Path

  1. Start: Manual upload for testing
  2. Grow: Set up AWS Pull or Azure Pull for automated polling
  3. Scale: Add Push notifications for real-time ingestion

Security Considerations

Data Transmission

  • HTTPS encryption - All uploads encrypted in transit
  • Secure storage - Files processed in an isolated environment
  • Automatic deletion - Original files discarded after processing

Sensitive Data

⚠️ Review before upload:

  • Remove credentials (API keys, passwords)
  • Redact PII (names, emails, addresses) if required
  • Mask sensitive IP addresses if needed

Tip: Use automated tools to sanitize logs before upload:

# Example: Remove API keys
sed 's/api_key=[^&]*/api_key=REDACTED/g' app.log > app-clean.log
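
If your logs may contain email addresses or other PII, a similar pattern works; the regex below is a simplified illustration and may need adjusting for your data:

# Example: redact email addresses (simplified pattern)
sed -E 's/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+/EMAIL_REDACTED/g' app.log > app-clean.log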

File Retention

| Data Type | Retention |
| --- | --- |
| Original files | Deleted immediately after processing |
| Parsed log entries | Retained per your plan (typically 30-90 days) |
| Metadata | Upload history retained indefinitely |

Note: Only parsed log data is retained - original files are not stored.


Next Steps

  • Upload your first file - Try the manual upload interface
  • Check dashboards - Verify logs appear correctly
  • Create alerts - Set up alerts for important patterns
  • Consider automation - When ready, set up AWS or Azure for continuous ingestion

Need Help?

  • Upload failing? Check file size and format
  • No data in dashboards? Verify file contents and format
  • Want to automate? See AWS or Azure setup guides
  • Other questions? Contact support with upload details