Building a Full-Stack File Upload System with Laravel, Vue.js, and S3

Learn how to build a production-ready Laravel file upload system with Vue.js drag-and-drop, S3 storage, and real-time progress tracking.


File uploads seem simple until you actually build them. You need validation, progress tracking, security, storage management, and a smooth user experience. And if you're building a SaaS product, you can't just dump files on your server; you need scalable cloud storage.

I've built file upload systems for multiple products over the years, and here's what I've learned: getting it right requires careful planning across your entire stack. You need a robust Laravel backend that handles validation and security, a responsive Vue.js frontend with drag-and-drop support, and proper S3 integration for scalable storage.

In this guide, I'll show you how to build a complete Laravel file upload system from scratch. We'll cover database design, backend API development, Vue.js component creation with progress tracking, and S3 configuration. By the end, you'll have a production-ready system that can handle multiple file uploads with real-time progress feedback.

Why This Stack Works for File Uploads

Laravel's built-in filesystem abstraction makes S3 integration straightforward. You write the same code whether files go to local storage or S3, just change the configuration. This flexibility saved me hours when migrating a client project from local storage to S3 as their user base grew.
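
As a quick illustration, the call site stays identical across disks; only the configuration decides where the bytes go (a minimal sketch, paths illustrative):

```php
use Illuminate\Support\Facades\Storage;

// Identical API whether the disk is local or S3;
// config/filesystems.php decides where the bytes land
Storage::disk('local')->put('uploads/report.pdf', $contents);
Storage::disk('s3')->put('uploads/report.pdf', $contents);
```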

Vue.js handles the frontend beautifully. Its reactive data system makes progress tracking natural, and the component architecture keeps your upload UI modular and reusable. Plus, Vue's file input handling is cleaner than vanilla JavaScript.

S3 provides unlimited scalable storage without managing servers. You pay only for what you use, and AWS handles availability, backups, and CDN integration through CloudFront if you need it later.

Database Schema Design

Let's start with the foundation. Here's the migration for our uploads table:

<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('uploads', function (Blueprint $table) {
            $table->id();
            $table->foreignId('user_id')->constrained()->cascadeOnDelete();
            $table->string('original_name');
            $table->string('filename'); // Sanitized stored name
            $table->string('path'); // S3 path
            $table->string('disk')->default('s3'); // Storage disk
            $table->string('mime_type');
            $table->unsignedBigInteger('size'); // Bytes
            $table->string('hash', 64)->nullable(); // SHA-256 for deduplication
            $table->json('metadata')->nullable(); // Image dimensions, etc.
            $table->timestamps();
            $table->softDeletes();
            
            // Index for finding duplicates
            $table->index(['user_id', 'hash']);
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('uploads');
    }
};

I include hash for deduplication: if a user uploads the same file twice, you can detect it and save storage costs. The metadata JSON column is useful for storing image dimensions, video duration, or any file-specific data you need later.
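
The metadata column pairs well with Laravel's JSON query syntax. For example, a hypothetical query against the schema above to find oversized images:

```php
use App\Models\Upload;

// The -> operator queries inside the JSON metadata column
$wide = Upload::where('mime_type', 'like', 'image/%')
    ->where('metadata->width', '>', 1920)
    ->get();
```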

Setting Up S3 Configuration

First, install the AWS SDK:

composer require league/flysystem-aws-s3-v3 "^3.0"

Then configure your .env file:

AWS_ACCESS_KEY_ID=your_access_key
AWS_SECRET_ACCESS_KEY=your_secret_key
AWS_DEFAULT_REGION=us-east-1
AWS_BUCKET=your-bucket-name
AWS_USE_PATH_STYLE_ENDPOINT=false

Laravel includes S3 configuration by default in config/filesystems.php, but here's what matters:

's3' => [
    'driver' => 's3',
    'key' => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    'region' => env('AWS_DEFAULT_REGION'),
    'bucket' => env('AWS_BUCKET'),
    'url' => env('AWS_URL'),
    'endpoint' => env('AWS_ENDPOINT'),
    'use_path_style_endpoint' => env('AWS_USE_PATH_STYLE_ENDPOINT', false),
    'throw' => false,
    'visibility' => 'private', // Important for security
],

Set visibility to private by default. You don't want uploaded files publicly accessible without authentication. We'll generate signed URLs when users need to download files.

Building the Upload Model

Here's a clean Upload model with useful methods:

<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;
use Illuminate\Database\Eloquent\Relations\BelongsTo;
use Illuminate\Database\Eloquent\SoftDeletes;
use Illuminate\Support\Facades\Storage;

class Upload extends Model
{
    use SoftDeletes;

    protected $fillable = [
        'user_id',
        'original_name',
        'filename',
        'path',
        'disk',
        'mime_type',
        'size',
        'hash',
        'metadata',
    ];

    protected $casts = [
        'metadata' => 'array',
        'size' => 'integer',
    ];

    public function user(): BelongsTo
    {
        return $this->belongsTo(User::class);
    }

    /**
     * Get a temporary signed URL for downloading
     */
    public function getTemporaryUrl(int $minutes = 5): string
    {
        return Storage::disk($this->disk)->temporaryUrl(
            $this->path,
            now()->addMinutes($minutes)
        );
    }

    /**
     * Delete the file from storage
     */
    public function deleteFile(): bool
    {
        return Storage::disk($this->disk)->delete($this->path);
    }

    /**
     * Get human-readable file size
     */
    public function getFormattedSizeAttribute(): string
    {
        $units = ['B', 'KB', 'MB', 'GB'];
        $size = $this->size;
        
        for ($i = 0; $size > 1024 && $i < count($units) - 1; $i++) {
            $size /= 1024;
        }
        
        return round($size, 2) . ' ' . $units[$i];
    }

    /**
     * Check if file is an image
     */
    public function isImage(): bool
    {
        return str_starts_with($this->mime_type, 'image/');
    }
}

The getTemporaryUrl method is crucial. Since your S3 files are private, you generate signed URLs that expire after a few minutes. This keeps files secure while allowing legitimate downloads.

Creating the Upload Service

I always extract file handling logic into a service class. Controllers should be thin — they receive requests and return responses. Business logic belongs in services.

<?php

namespace App\Services;

use App\Models\Upload;
use Illuminate\Http\UploadedFile;
use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Str;

class UploadService
{
    public function __construct(
        private string $disk = 's3'
    ) {}

    /**
     * Handle file upload and create database record
     */
    public function upload(UploadedFile $file, int $userId): Upload
    {
        // Generate unique filename
        $filename = $this->generateFilename($file);
        
        // Calculate file hash for deduplication
        $hash = hash_file('sha256', $file->getRealPath());
        
        // Check for existing file with same hash
        $existing = Upload::where('user_id', $userId)
            ->where('hash', $hash)
            ->first();
            
        if ($existing) {
            return $existing; // Return existing upload
        }
        
        // Store file
        $path = Storage::disk($this->disk)->putFileAs(
            'uploads/' . date('Y/m'),
            $file,
            $filename
        );
        
        // Extract metadata for images
        $metadata = $this->extractMetadata($file);
        
        // Create database record
        return Upload::create([
            'user_id' => $userId,
            'original_name' => $file->getClientOriginalName(),
            'filename' => $filename,
            'path' => $path,
            'disk' => $this->disk,
            'mime_type' => $file->getMimeType(),
            'size' => $file->getSize(),
            'hash' => $hash,
            'metadata' => $metadata,
        ]);
    }

    /**
     * Upload multiple files
     */
    public function uploadMultiple(array $files, int $userId): array
    {
        $uploads = [];
        
        foreach ($files as $file) {
            $uploads[] = $this->upload($file, $userId);
        }
        
        return $uploads;
    }

    /**
     * Delete upload and file from storage
     */
    public function delete(Upload $upload): bool
    {
        $upload->deleteFile();
        return $upload->delete();
    }

    /**
     * Generate unique filename preserving extension
     */
    private function generateFilename(UploadedFile $file): string
    {
        $extension = $file->getClientOriginalExtension();
        return Str::uuid() . '.' . $extension;
    }

    /**
     * Extract metadata from file
     */
    private function extractMetadata(UploadedFile $file): ?array
    {
        if (!str_starts_with($file->getMimeType(), 'image/')) {
            return null;
        }
        
        $imageInfo = getimagesize($file->getRealPath());
        
        return [
            'width' => $imageInfo[0] ?? null,
            'height' => $imageInfo[1] ?? null,
        ];
    }
}

The deduplication logic saved a client thousands in S3 costs. Users often upload the same company logo or document multiple times; why store duplicates?

Building the Upload Controller

Now let's create the API endpoints. I'm keeping the controller thin, using PHP constructor property promotion to inject the service:

<?php

namespace App\Http\Controllers\Api;

use App\Http\Controllers\Controller;
use App\Http\Requests\UploadRequest;
use App\Models\Upload;
use App\Services\UploadService;
use Illuminate\Http\JsonResponse;

class UploadController extends Controller
{
    public function __construct(
        private UploadService $uploadService
    ) {}

    /**
     * Upload single or multiple files
     */
    public function store(UploadRequest $request): JsonResponse
    {
        $files = $request->file('files');
        $userId = auth()->id();
        
        // Handle single file
        if (!is_array($files)) {
            $upload = $this->uploadService->upload($files, $userId);
            
            return response()->json([
                'success' => true,
                'upload' => $this->formatUpload($upload),
            ], 201);
        }
        
        // Handle multiple files
        $uploads = $this->uploadService->uploadMultiple($files, $userId);
        
        return response()->json([
            'success' => true,
            'uploads' => array_map(
                fn($upload) => $this->formatUpload($upload),
                $uploads
            ),
        ], 201);
    }

    /**
     * Get user's uploads
     */
    public function index(): JsonResponse
    {
        $uploads = Upload::where('user_id', auth()->id())
            ->latest()
            ->paginate(20);
            
        return response()->json([
            'uploads' => $uploads->map(fn($upload) => $this->formatUpload($upload)),
            'pagination' => [
                'current_page' => $uploads->currentPage(),
                'total' => $uploads->total(),
                'per_page' => $uploads->perPage(),
            ],
        ]);
    }

    /**
     * Get download URL for file
     */
    public function download(Upload $upload): JsonResponse
    {
        $this->authorize('view', $upload);
        
        return response()->json([
            'url' => $upload->getTemporaryUrl(5),
            'expires_in' => 300, // seconds
        ]);
    }

    /**
     * Delete upload
     */
    public function destroy(Upload $upload): JsonResponse
    {
        $this->authorize('delete', $upload);
        
        $this->uploadService->delete($upload);
        
        return response()->json([
            'success' => true,
            'message' => 'Upload deleted successfully',
        ]);
    }

    /**
     * Format upload for API response
     */
    private function formatUpload(Upload $upload): array
    {
        return [
            'id' => $upload->id,
            'original_name' => $upload->original_name,
            'mime_type' => $upload->mime_type,
            'size' => $upload->size,
            'formatted_size' => $upload->formatted_size,
            'is_image' => $upload->isImage(),
            'metadata' => $upload->metadata,
            'created_at' => $upload->created_at->toISOString(),
        ];
    }
}

Notice I'm using policy authorization with $this->authorize(). You don't want users downloading or deleting each other's files.
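
A minimal sketch of that policy, reduced to an ownership check (Laravel auto-discovers policies in the App\Policies namespace; otherwise register it in AuthServiceProvider):

```php
<?php

namespace App\Policies;

use App\Models\Upload;
use App\Models\User;

class UploadPolicy
{
    // A user may only view (download) their own uploads
    public function view(User $user, Upload $upload): bool
    {
        return $upload->user_id === $user->id;
    }

    // Same ownership rule for deletion
    public function delete(User $user, Upload $upload): bool
    {
        return $upload->user_id === $user->id;
    }
}
```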

Creating the Upload Request Validator

Validation is critical for security. Here's the form request:

<?php

namespace App\Http\Requests;

use Illuminate\Foundation\Http\FormRequest;

class UploadRequest extends FormRequest
{
    public function authorize(): bool
    {
        return auth()->check();
    }

    public function rules(): array
    {
        return [
            'files' => ['required'],
            'files.*' => [
                'file',
                'max:10240', // 10MB max per file
                'mimes:jpg,jpeg,png,gif,pdf,doc,docx,xls,xlsx,zip',
            ],
        ];
    }

    public function messages(): array
    {
        return [
            'files.*.max' => 'Each file must not exceed 10MB',
            'files.*.mimes' => 'Invalid file type. Allowed: images, PDFs, documents, spreadsheets, ZIP files',
        ];
    }
}

Adjust the max size and allowed MIME types based on your needs. I kept it to 10MB here, but I've worked on projects requiring 100MB+ video uploads. Just make sure your php.ini settings allow it:

upload_max_filesize = 100M
post_max_size = 100M
max_execution_time = 300

Building the Vue.js Upload Component

Now for the frontend magic. This Vue 3 component handles drag-and-drop, multiple files, and progress tracking:

<template>
  <div class="upload-container">
    <div
      class="drop-zone"
      :class="{ 'drag-over': isDragging }"
      @drop.prevent="handleDrop"
      @dragover.prevent="isDragging = true"
      @dragleave="isDragging = false"
    >
      <input
        type="file"
        ref="fileInput"
        multiple
        @change="handleFileSelect"
        class="hidden"
      />
      
      <div v-if="!uploading" class="drop-zone-content">
        <svg class="upload-icon"><!-- ... icon SVG ... --></svg>
        <p class="drop-text">
          Drag files here or 
          <button @click="$refs.fileInput.click()" class="browse-btn">
            browse
          </button>
        </p>
        <p class="file-info">
          Supports: JPG, PNG, PDF, DOC, XLS, ZIP (max 10MB each)
        </p>
      </div>

      <div v-else class="upload-progress">
        <div v-for="file in files" :key="file.name" class="file-progress">
          <div class="file-info">
            <span class="file-name">{{ file.name }}</span>
            <span class="file-size">{{ formatSize(file.size) }}</span>
          </div>
          <div class="progress-bar">
            <div 
              class="progress-fill"
              :style="{ width: file.progress + '%' }"
            />
          </div>
          <span class="progress-text">{{ file.progress }}%</span>
        </div>
      </div>
    </div>

    <div v-if="uploads.length > 0" class="uploads-list">
      <h3>Recent Uploads</h3>
      <div v-for="upload in uploads" :key="upload.id" class="upload-item">
        <div class="upload-info">
          <span class="upload-name">{{ upload.original_name }}</span>
          <span class="upload-size">{{ upload.formatted_size }}</span>
        </div>
        <div class="upload-actions">
          <button @click="downloadFile(upload)" class="btn-download">
            Download
          </button>
          <button @click="deleteFile(upload)" class="btn-delete">
            Delete
          </button>
        </div>
      </div>
    </div>
  </div>
</template>

<script setup>
import { ref, onMounted } from 'vue';
import axios from 'axios';

const fileInput = ref(null);
const isDragging = ref(false);
const uploading = ref(false);
const files = ref([]);
const uploads = ref([]);

const handleFileSelect = (event) => {
  const selectedFiles = Array.from(event.target.files);
  uploadFiles(selectedFiles);
};

const handleDrop = (event) => {
  isDragging.value = false;
  const droppedFiles = Array.from(event.dataTransfer.files);
  uploadFiles(droppedFiles);
};

const uploadFiles = async (fileList) => {
  uploading.value = true;
  
  // Initialize progress tracking
  files.value = fileList.map(file => ({
    name: file.name,
    size: file.size,
    progress: 0,
  }));
  
  const formData = new FormData();
  fileList.forEach(file => {
    formData.append('files[]', file);
  });
  
  try {
    const response = await axios.post('/api/uploads', formData, {
      headers: {
        'Content-Type': 'multipart/form-data',
      },
      onUploadProgress: (progressEvent) => {
        const percentCompleted = Math.round(
          (progressEvent.loaded * 100) / progressEvent.total
        );
        
        // Update progress for all files
        files.value.forEach(file => {
          file.progress = percentCompleted;
        });
      },
    });
    
    // Add new uploads to list
    const newUploads = Array.isArray(response.data.uploads)
      ? response.data.uploads
      : [response.data.upload];
      
    uploads.value.unshift(...newUploads);
    
    // Reset
    files.value = [];
    uploading.value = false;
    fileInput.value.value = '';
    
  } catch (error) {
    console.error('Upload failed:', error);
    alert('Upload failed: ' + (error.response?.data?.message || 'Unknown error'));
    uploading.value = false;
  }
};

const downloadFile = async (upload) => {
  try {
    const response = await axios.get(`/api/uploads/${upload.id}/download`);
    window.open(response.data.url, '_blank');
  } catch (error) {
    console.error('Download failed:', error);
    alert('Download failed');
  }
};

const deleteFile = async (upload) => {
  if (!confirm('Delete this file?')) return;
  
  try {
    await axios.delete(`/api/uploads/${upload.id}`);
    uploads.value = uploads.value.filter(u => u.id !== upload.id);
  } catch (error) {
    console.error('Delete failed:', error);
    alert('Delete failed');
  }
};

const formatSize = (bytes) => {
  const units = ['B', 'KB', 'MB', 'GB'];
  let size = bytes;
  let unitIndex = 0;
  
  while (size > 1024 && unitIndex < units.length - 1) {
    size /= 1024;
    unitIndex++;
  }
  
  return `${size.toFixed(2)} ${units[unitIndex]}`;
};

const fetchUploads = async () => {
  try {
    const response = await axios.get('/api/uploads');
    uploads.value = response.data.uploads;
  } catch (error) {
    console.error('Failed to fetch uploads:', error);
  }
};

onMounted(() => {
  fetchUploads();
});
</script>

<style scoped>
.upload-container {
  max-width: 800px;
  margin: 0 auto;
  padding: 2rem;
}

.drop-zone {
  border: 2px dashed #cbd5e0;
  border-radius: 8px;
  padding: 3rem;
  text-align: center;
  transition: all 0.3s;
  background: #f7fafc;
}

.drop-zone.drag-over {
  border-color: #4299e1;
  background: #ebf8ff;
}

.drop-zone-content {
  display: flex;
  flex-direction: column;
  align-items: center;
  gap: 1rem;
}

.upload-icon {
  width: 64px;
  height: 64px;
  color: #a0aec0;
}

.browse-btn {
  color: #4299e1;
  text-decoration: underline;
  background: none;
  border: none;
  cursor: pointer;
  padding: 0;
}

.hidden {
  display: none;
}

.file-progress {
  margin-bottom: 1rem;
}

.progress-bar {
  height: 8px;
  background: #e2e8f0;
  border-radius: 4px;
  overflow: hidden;
  margin: 0.5rem 0;
}

.progress-fill {
  height: 100%;
  background: #4299e1;
  transition: width 0.3s;
}

.uploads-list {
  margin-top: 2rem;
}

.upload-item {
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding: 1rem;
  border: 1px solid #e2e8f0;
  border-radius: 4px;
  margin-bottom: 0.5rem;
}

.btn-download,
.btn-delete {
  padding: 0.5rem 1rem;
  border-radius: 4px;
  border: none;
  cursor: pointer;
  margin-left: 0.5rem;
}

.btn-download {
  background: #4299e1;
  color: white;
}

.btn-delete {
  background: #fc8181;
  color: white;
}
</style>

This component does everything: drag-and-drop detection, progress bars for each file, upload list management, and download/delete actions. The onUploadProgress callback from Axios makes progress tracking simple.

Handling Large File Uploads

For files over 100MB, consider chunked uploads. The approach involves splitting files into chunks on the frontend and uploading them sequentially:

const uploadLargeFile = async (file) => {
  const chunkSize = 5 * 1024 * 1024; // 5MB chunks
  const chunks = Math.ceil(file.size / chunkSize);
  
  for (let i = 0; i < chunks; i++) {
    const start = i * chunkSize;
    const end = Math.min(start + chunkSize, file.size);
    const chunk = file.slice(start, end);
    
    const formData = new FormData();
    formData.append('chunk', chunk);
    formData.append('chunk_index', i);
    formData.append('total_chunks', chunks);
    formData.append('filename', file.name);
    
    await axios.post('/api/uploads/chunk', formData);
  }
  
  // Finalize upload
  await axios.post('/api/uploads/finalize', {
    filename: file.name,
    total_chunks: chunks,
  });
};

You'll need backend endpoints to handle chunk assembly.
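
A rough sketch of what those endpoints might look like, buffering chunks on the local disk before pushing the assembled file to S3 (route names, the chunks/ prefix, and the md5 keying are illustrative; production code should also validate ownership and handle missing chunks):

```php
use Illuminate\Http\File;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

// Store each chunk on the local disk until all parts arrive
public function storeChunk(Request $request): JsonResponse
{
    $request->file('chunk')->storeAs(
        'chunks/' . md5($request->input('filename')),
        (string) $request->input('chunk_index'),
        'local'
    );

    return response()->json(['received' => true]);
}

// Concatenate the chunks in order and move the result to S3
public function finalize(Request $request): JsonResponse
{
    $dir = storage_path('app/chunks/' . md5($request->input('filename')));
    $assembled = $dir . '/assembled';

    $out = fopen($assembled, 'wb');
    for ($i = 0; $i < (int) $request->input('total_chunks'); $i++) {
        fwrite($out, file_get_contents($dir . '/' . $i));
    }
    fclose($out);

    $path = Storage::disk('s3')->putFile('uploads/' . date('Y/m'), new File($assembled));
    // ...create the Upload record here, then delete the chunk directory

    return response()->json(['path' => $path]);
}
```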

Security Best Practices

Never trust user uploads. Here's what I implement in every project:

1. Validate MIME types server-side Don't rely on file extensions. Check the actual file content:

use Illuminate\Support\Facades\File;

$mimeType = File::mimeType($file->getRealPath());
$allowedTypes = ['image/jpeg', 'image/png', 'application/pdf'];

if (!in_array($mimeType, $allowedTypes)) {
    throw new \Exception('Invalid file type');
}

2. Scan files for malware For production systems, integrate ClamAV or a cloud scanning service:

use Socket\Raw\Factory;
use Xenolope\Quahog\Client;

// quahog talks to clamd over a raw socket (clue/socket-raw)
$socket = (new Factory())->createClient('tcp://127.0.0.1:3310');
$scanner = new Client($socket, 30, PHP_NORMAL_READ);
$result = $scanner->scanFile($file->getRealPath());

// quahog 3.x returns a Result object; 2.x returned an
// array with a 'status' key instead
if ($result->isFound()) {
    throw new \Exception('Malware detected');
}

3. Sanitize filenames User-provided filenames can contain path traversal attacks:

$filename = preg_replace('/[^a-zA-Z0-9._-]/', '', $originalName);

4. Set proper S3 permissions Your bucket policy should restrict public access:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::your-bucket/*",
      "Condition": {
        "Bool": {
          "aws:SecureTransport": "false"
        }
      }
    }
  ]
}

Optimizing Upload Performance

I've reduced upload times by 40% with these optimizations:

Use direct S3 uploads from the frontend

Instead of uploading through your Laravel server, generate presigned URLs and upload directly to S3:

public function getPresignedUrl(Request $request): JsonResponse
{
    $filename = Str::uuid() . '.' . $request->input('extension');
    $path = 'uploads/' . date('Y/m') . '/' . $filename;
    
    // temporaryUrl() signs GET (download) requests; for uploads you
    // need a presigned PUT URL, which temporaryUploadUrl() (Laravel
    // 9.32+) returns along with the headers the client must send
    ['url' => $url, 'headers' => $headers] = Storage::disk('s3')
        ->temporaryUploadUrl($path, now()->addMinutes(10));
    
    return response()->json([
        'url' => $url,
        'headers' => $headers,
        'path' => $path,
    ]);
}

Then upload from Vue:

const uploadDirectToS3 = async (file) => {
  // Get presigned URL
  const { data } = await axios.post('/api/uploads/presigned-url', {
    extension: file.name.split('.').pop(),
    mime_type: file.type,
  });
  
  // Upload directly to S3, including any headers the signature requires
  await axios.put(data.url, file, {
    headers: {
      'Content-Type': file.type,
      ...(data.headers || {}),
    },
  });
  
  // Register upload in database
  await axios.post('/api/uploads/register', {
    path: data.path,
    original_name: file.name,
  });
};

This eliminates your server as a bottleneck. Your Laravel app just generates URLs and tracks uploads, S3 handles the heavy lifting.

Common Upload Mistakes to Avoid

I've debugged these issues more times than I'd like to admit:

1. Forgetting CORS configuration If you're doing direct S3 uploads, configure CORS on your bucket:

[
  {
    "AllowedHeaders": ["*"],
    "AllowedMethods": ["PUT", "POST"],
    "AllowedOrigins": ["https://yourdomain.com"],
    "ExposeHeaders": ["ETag"]
  }
]

2. Not handling upload failures Network drops happen. Implement retry logic:

const uploadWithRetry = async (file, maxRetries = 3) => {
  for (let i = 0; i < maxRetries; i++) {
    try {
      return await uploadFile(file);
    } catch (error) {
      if (i === maxRetries - 1) throw error;
      await new Promise(resolve => setTimeout(resolve, 1000 * (i + 1)));
    }
  }
};

3. Ignoring file size limits Check file sizes before upload to save bandwidth:

const MAX_FILE_SIZE = 10 * 1024 * 1024; // 10MB

if (file.size > MAX_FILE_SIZE) {
  alert('File too large. Maximum size: 10MB');
  return;
}

4. Not cleaning up failed uploads S3 charges for storage. Delete incomplete uploads with lifecycle rules or a cleanup job:

// Command to clean up old unregistered files
public function handle()
{
    $registeredPaths = Upload::pluck('path')->toArray();
    $s3Files = Storage::disk('s3')->allFiles('uploads');
    
    foreach ($s3Files as $file) {
        if (!in_array($file, $registeredPaths)) {
            $fileTime = Storage::disk('s3')->lastModified($file);
            
            if ($fileTime < now()->subDays(7)->timestamp) {
                Storage::disk('s3')->delete($file);
            }
        }
    }
}
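
For the lifecycle-rule route, a bucket configuration like this one expires abandoned multipart uploads automatically (the prefix and day count are examples, adjust to your bucket layout):

```json
{
  "Rules": [
    {
      "ID": "abort-stale-multipart-uploads",
      "Status": "Enabled",
      "Filter": { "Prefix": "uploads/" },
      "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 7 }
    }
  ]
}
```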

Testing Your Upload System

I test uploads with Pest (the same approach works in plain PHPUnit). Here's a basic test:

use App\Models\Upload;
use App\Models\User;
use Illuminate\Http\UploadedFile;
use Illuminate\Support\Facades\Storage;

test('user can upload file', function () {
    Storage::fake('s3');
    
    $user = User::factory()->create();
    $file = UploadedFile::fake()->image('test.jpg', 1000, 1000)->size(100);
    
    $response = $this->actingAs($user)
        ->post('/api/uploads', [
            'files' => [$file],
        ]);
    
    $response->assertStatus(201);
    $response->assertJsonStructure([
        'success',
        'uploads' => [
            '*' => ['id', 'original_name', 'size'],
        ],
    ]);
    
    // The service stores files under a generated UUID name, so assert
    // against the path recorded in the database rather than hashName()
    Storage::disk('s3')->assertExists(Upload::first()->path);
});

Use Storage::fake() to avoid hitting S3 during tests. It's faster and doesn't cost money.

Alternative Approaches and Trade-offs

This approach works great for most apps, but there are alternatives:

Direct browser uploads with AWS SDK Use the AWS JavaScript SDK directly in your frontend. Pro: fastest uploads, no backend processing. Con: exposes AWS credentials (even if temporary), more complex frontend code.

Using Laravel Media Library Spatie's Media Library package handles uploads elegantly. Pro: battle-tested, handles conversions and variants automatically. Con: adds dependency, might be overkill for simple uploads.

Serverless with Lambda Process uploads asynchronously with Lambda functions triggered by S3 events. Pro: infinite scalability, cost-effective. Con: more complex architecture, requires AWS expertise.

I prefer the approach in this guide because it balances simplicity with production-readiness. You get clean separation of concerns, proper security, and good performance without architectural complexity.

Wrapping Up

You now have a complete Laravel file upload system with Vue.js, S3, and progress tracking. The key pieces are:

  • Solid database schema with deduplication
  • Service layer for business logic separation
  • Secure validation and authorization
  • Responsive Vue component with drag-and-drop
  • Private S3 storage with temporary signed URLs
  • Progress tracking and error handling

I use variations of this system in multiple products. It scales well; I've handled millions of uploads without issues. The architecture is clean enough that adding features like image resizing, virus scanning, or zip generation is straightforward.

Start with this foundation and adapt it to your needs. Maybe you need different file types, larger size limits, or thumbnail generation. The patterns here give you a solid base to build on.

Need help implementing file uploads for your Laravel project? Let's work together: Contact me




About Hafiz Riaz

Full Stack Developer from Turin, Italy. I build web applications with Laravel and Vue.js, and automate business processes. Creator of ReplyGenius, StudyLab, and other SaaS products.

View Portfolio →