List S3 Buckets

Overview

Multi-Profile S3 Management

Multi-Profile S3 Safari! Ever juggled multiple AWS accounts and needed a quick S3 bucket inventory across all of them? This Python script is your guide!

Use Case

Perfect for organizations managing multiple AWS accounts or developers working with different IAM roles and profiles.

The Problem

Multi-Profile Complexity

When you’re working with multiple AWS accounts or IAM roles through different profiles, getting a consolidated view of your S3 buckets can be a hassle. Switching contexts or running commands repeatedly for each profile is inefficient and prone to oversight, making comprehensive inventory or auditing a chore.

Manual Process Inefficiencies

Traditional approaches require manual switching between profiles and running separate commands for each account.

The Solution

Automated Profile Discovery

This Python script, powered by Boto3, automates the discovery of all your configured AWS CLI profiles. It then iterates through each profile, attempting to list its S3 buckets. The script provides a clear, profile-by-profile breakdown and includes robust error handling for issues like missing credentials or access denied for specific profiles, ensuring it runs smoothly even in complex setups.

Comprehensive Error Handling

Gracefully handles various error conditions including missing credentials, access denied, and network issues.
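The error handling hinges on the structure botocore attaches to a `ClientError`: its `response` attribute is a nested dict with the error code and message under an `"Error"` key. A rough sketch of that extraction, using a plain dict shaped like `e.response` (a real `ClientError` only appears when an API call actually fails):

```python
def describe_client_error(response):
    """Pull the error code and message out of a botocore-style response dict."""
    error = response.get("Error", {})
    return error.get("Code", "Unknown"), error.get("Message", "")

# A stand-in dict mirroring what botocore puts on e.response:
code, message = describe_client_error(
    {"Error": {"Code": "AccessDenied", "Message": "Access Denied"}}
)
print(code)  # → AccessDenied
```

Using `.get()` with defaults at every level keeps the script from crashing on an unexpectedly shaped error response.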

Features

Key Capabilities

TL;DR

  • Lists S3 buckets across ALL configured AWS CLI profiles in one go.
  • Automatically discovers available AWS profiles.
  • Provides a clear, per-profile output of S3 buckets.
  • Gracefully handles errors for individual profiles (e.g., credential issues, access denied).

Output Format

Organized display showing buckets grouped by AWS profile with clear visual separation.
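With two hypothetical profiles (`default` and `staging`), the output looks roughly like this:

```
Checking S3 Buckets across 2 AWS Profiles:
----------------------------------------

Profile: default
  S3 Buckets:
    - my-app-assets
    - my-app-logs
----------------------------------------

Profile: staging
  S3 Buckets:
    No buckets found or profile inaccessible.
----------------------------------------
```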

Code Implementation

Dependencies and Setup

list_s3_buckets.py
import boto3
from botocore.exceptions import NoCredentialsError, ClientError

Profile Discovery Function

def get_aws_profiles():
    """Get all configured AWS profiles from the AWS CLI configuration."""
    try:
        session = boto3.Session()
        return session.available_profiles
    except Exception as e:
        print(f"Error getting AWS profiles: {e}")
        return []

Bucket Listing Function

def list_s3_buckets_for_profile(profile_name):
    """
    Lists all S3 buckets for a specific AWS profile.
    Returns a list of bucket names or an empty list if an error occurs.
    """
    buckets = []
    try:
        session = boto3.Session(profile_name=profile_name)
        s3_client = session.client('s3')
        response = s3_client.list_buckets()
        for bucket in response.get('Buckets', []):
            buckets.append(bucket['Name'])
    except NoCredentialsError:
        print(f"  Warning: No credentials found for profile '{profile_name}'. Skipping.")
    except ClientError as e:
        error_code = e.response.get("Error", {}).get("Code")
        error_message = e.response.get("Error", {}).get("Message")
        print(f"  Warning: AWS Client Error for profile '{profile_name}' ({error_code}): {error_message}. Skipping.")
    except Exception as e:
        print(f"  Warning: An unexpected error occurred for profile '{profile_name}': {e}. Skipping.")
    return buckets

Main Display Function

def display_all_s3_buckets_by_profile():
    """
    Fetches and displays S3 buckets for all configured AWS profiles.
    """
    profiles = get_aws_profiles()
    if not profiles:
        print("No AWS profiles found. Please configure your AWS CLI.")
        return
    print(f"\nChecking S3 Buckets across {len(profiles)} AWS Profiles:")
    print("-" * 40)
    # Track results during the loop so we don't re-query every profile later.
    found_any = False
    for profile in sorted(profiles):
        print(f"\nProfile: {profile}")
        print("  S3 Buckets:")
        buckets = list_s3_buckets_for_profile(profile)
        if buckets:
            found_any = True
            for bucket_name in buckets:
                print(f"    - {bucket_name}")
        else:
            print("    No buckets found or profile inaccessible.")
        print("-" * 40)
    if not found_any:
        print("\nNo S3 buckets found across any configured profiles or all were inaccessible.")


if __name__ == "__main__":
    display_all_s3_buckets_by_profile()
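If you'd rather consume the results programmatically than read printed output, one option is to collect the per-profile results into a dict. This sketch injects the listing function as a parameter (in the script above that would be `list_s3_buckets_for_profile`) so the grouping logic can be tested with a stub instead of live AWS calls:

```python
def collect_buckets_by_profile(profiles, list_buckets):
    """Map each profile name to its bucket list, skipping empty results.

    `list_buckets` is any callable taking a profile name and returning
    a list of bucket names (e.g. list_s3_buckets_for_profile).
    """
    return {p: b for p in sorted(profiles) if (b := list_buckets(p))}

# Example with a stub standing in for the real Boto3-backed function:
stub = {"dev": ["logs", "assets"], "prod": []}.get
print(collect_buckets_by_profile(["prod", "dev"], stub))
# → {'dev': ['logs', 'assets']}
```

Separating the grouping from the AWS calls also makes it easy to feed the result into a report or a JSON dump later.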

Benefits and Usage

Operational Advantages

Why This Helps

This script is a game-changer for anyone managing multiple AWS environments. It drastically simplifies S3 bucket auditing, inventory tasks, and compliance checks across your entire AWS footprint. Get a unified view without the manual grind, making your cloud operations more efficient and less error-prone.

Use Cases

Ideal for inventory management, compliance auditing, and multi-account AWS operations.

Community Discussion

Your S3 Management Approaches

Your Turn

How do you manage S3 bucket information in your Python projects? Any favorite Boto3 tricks for S3 you’d like to share?

Alternative Tools

Share your preferred methods for managing S3 resources across multiple AWS accounts.
