AWS S3 Integration: Encrypt Objects with PQC
Encrypt S3 objects with quantum-resistant ML-KEM in 10 minutes
🚀 Encrypt your first S3 object now
Quick Start: S3 Encryption
Estimated time: 10 minutes
What you'll achieve: upload a file to S3, encrypted with AnkaSecure ML-KEM
Requirements: AWS account, AnkaSecure API access
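The curl commands below assume your AnkaSecure API token is available in the $TOKEN environment variable. One way to set it for the session (the value is a placeholder):
# Export your AnkaSecure API token for the curl examples that follow
export TOKEN="paste-your-ankasecure-api-token-here"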
Step 1/4: Generate ML-KEM key (1 minute)
# Generate quantum-resistant key for S3 data
curl -X POST https://api.ankatech.co/keys \
-H "Authorization: Bearer $TOKEN" \
-d '{
"algorithm": "ML_KEM_1024",
"purpose": "S3_OBJECT_ENCRYPTION"
}'
✅ Result: keyId: "s3-mlkem-key-001"
Step 2/4: Encrypt file locally (3 minutes)
# Encrypt file before S3 upload
curl -X POST https://api.ankatech.co/encrypt \
-H "Authorization: Bearer $TOKEN" \
-F "keyId=s3-mlkem-key-001" \
-F "[email protected]" \
-o important-data.pdf.enc
✅ Result: important-data.pdf.enc (quantum-resistant ciphertext)
Step 3/4: Upload to S3 (2 minutes)
# Upload encrypted file to S3
aws s3 cp important-data.pdf.enc s3://mybucket/encrypted/
# Optionally: Add metadata
aws s3 cp important-data.pdf.enc s3://mybucket/encrypted/ \
--metadata encryption=ML-KEM-1024,keyId=s3-mlkem-key-001
✅ Result: File in S3, protected by quantum-resistant encryption
Security: even if the S3 bucket is compromised, the objects remain protected by quantum-resistant encryption; only holders of the AnkaSecure key can decrypt them
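If you attached the metadata, you can confirm it was stored on the object with a head-object call (bucket and key as in the example above):
# Inspect the stored object's user metadata
aws s3api head-object \
  --bucket mybucket \
  --key encrypted/important-data.pdf.enc \
  --query Metadata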
Step 4/4: Download and decrypt (4 minutes)
# Download from S3
aws s3 cp s3://mybucket/encrypted/important-data.pdf.enc .
# Decrypt with AnkaSecure
curl -X POST https://api.ankatech.co/decrypt \
-H "Authorization: Bearer $TOKEN" \
-F "keyId=s3-mlkem-key-001" \
-F "[email protected]" \
-o important-data.pdf
✅ Result: Original PDF recovered
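If you want to confirm the round trip byte-for-byte, hash the file before Step 2 and again after Step 4; the two digests should match:
# Run before encrypting and again after decrypting; the digests should be identical
sha256sum important-data.pdf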
🎯 Verified: S3 + AnkaSecure = Quantum-resistant cloud storage
What's next?
- Automate with Lambda: encrypt on S3 upload
- Bulk S3 encryption: encrypt an entire bucket
- Server-side integration: Java SDK for S3
Why Encrypt S3 with AnkaSecure?
vs AWS S3 Server-Side Encryption (SSE)
AWS SSE-KMS (built-in):
Pros:
✅ Automatic (no code)
✅ Integrated with S3
Cons:
❌ No PQC option (no ML-KEM support)
❌ AWS lock-in (keys in AWS KMS)
❌ Per-request KMS charges ($0.03 per 10,000 requests) that grow with volume
AnkaSecure client-side:
Pros:
✅ Quantum-resistant (ML-KEM)
✅ Portable (works with any storage: S3, Azure, GCP)
✅ Cost-effective ($40K/year unlimited)
Cons:
⚠️ Requires code (encrypt before upload)
⚠️ Manual (not automatic like SSE)
When to use AnkaSecure:
- Data retention > 10 years (quantum threat)
- Multi-cloud strategy (not locked to AWS)
- Very high KMS request volumes (flat license vs per-request charges)
- Federal compliance (CNSA 2.0 by 2030)
Integration Patterns
Pattern 1: Lambda Auto-Encryption
Scenario: Automatically encrypt files when uploaded to S3
Architecture:
User uploads to S3 (unencrypted bucket)
↓
S3 event triggers Lambda
↓
Lambda downloads file
↓
Lambda encrypts with AnkaSecure (ML-KEM)
↓
Lambda uploads to encrypted bucket
↓
Lambda deletes original (optional)
Lambda function (Node.js):
// Note: aws-sdk v2 and axios must be bundled in node_modules
// (the nodejs18.x runtime only ships the AWS SDK v3 out of the box).
const AWS = require('aws-sdk');
const axios = require('axios');

exports.handler = async (event) => {
  const s3 = new AWS.S3();
  const record = event.Records[0];
  // S3 event keys are URL-encoded; decode before using them
  const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, ' '));

  // Download the original object
  const object = await s3.getObject({
    Bucket: record.s3.bucket.name,
    Key: key
  }).promise();

  // Encrypt with AnkaSecure (quantum-resistant ML-KEM key)
  const encrypted = await axios.post('https://api.ankatech.co/encrypt', {
    keyId: 's3-mlkem-key',
    plaintext: object.Body.toString('base64')
  }, {
    headers: { 'Authorization': `Bearer ${process.env.ANKASECURE_TOKEN}` }
  });

  // Upload the encrypted copy to the target bucket
  await s3.putObject({
    Bucket: 'mybucket-encrypted',
    Key: key + '.enc',
    Body: Buffer.from(encrypted.data.ciphertext, 'base64'),
    Metadata: {
      encryption: 'ML-KEM-1024',
      keyId: 's3-mlkem-key',
      originalName: key
    }
  }).promise();

  return { statusCode: 200 };
};
Deployment:
# Package Lambda (include node_modules so aws-sdk v2 and axios are bundled)
zip -r function.zip index.js node_modules/

# Create Lambda function (replace the role ARN with your Lambda execution role)
aws lambda create-function \
  --function-name S3-AnkaSecure-Encrypt \
  --runtime nodejs18.x \
  --handler index.handler \
  --role arn:aws:iam::123456789012:role/lambda-s3-encrypt-role \
  --zip-file fileb://function.zip \
  --environment "Variables={ANKASECURE_TOKEN=your-token}"

# Allow S3 to invoke the function (the bucket notification itself is configured below)
aws lambda add-permission \
  --function-name S3-AnkaSecure-Encrypt \
  --statement-id s3-invoke \
  --action lambda:InvokeFunction \
  --principal s3.amazonaws.com \
  --source-arn arn:aws:s3:::mybucket
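The add-permission call only authorizes S3 to invoke the function; the trigger itself is the bucket's notification configuration. A minimal sketch (region, account ID, and bucket name are placeholders to adapt):
# Wire the S3 event notification to the Lambda function
aws s3api put-bucket-notification-configuration \
  --bucket mybucket \
  --notification-configuration '{
    "LambdaFunctionConfigurations": [{
      "LambdaFunctionArn": "arn:aws:lambda:eu-west-1:123456789012:function:S3-AnkaSecure-Encrypt",
      "Events": ["s3:ObjectCreated:*"]
    }]
  }'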
Result: All S3 uploads automatically quantum-encrypted
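A quick end-to-end check, assuming the bucket names used in the example above:
# Upload a test object to the source bucket...
aws s3 cp test.txt s3://mybucket/
# ...give the Lambda a few seconds, then look for the encrypted copy
sleep 10
aws s3 ls s3://mybucket-encrypted/ | grep test.txt.enc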
Pattern 2: Bulk S3 Bucket Encryption
Scenario: Encrypt 100,000 existing S3 objects
Script:
#!/bin/bash
# Bulk S3 encryption script
BUCKET="mybucket"
PREFIX="documents/"

# List all objects (assumes keys contain no spaces; the key is the 4th column of `aws s3 ls`)
aws s3 ls "s3://$BUCKET/$PREFIX" --recursive | awk '{print $4}' > files.txt

# Encrypt each file
while read -r key; do
  # Download
  aws s3 cp "s3://$BUCKET/$key" temp-file
  # Encrypt with AnkaSecure
  curl -X POST https://api.ankatech.co/encrypt \
    -H "Authorization: Bearer $TOKEN" \
    -F "keyId=s3-mlkem-key" \
    -F "file=@temp-file" \
    -o temp-file.enc
  # Upload encrypted copy
  aws s3 cp temp-file.enc "s3://$BUCKET-encrypted/$key.enc"
  # Cleanup
  rm temp-file temp-file.enc
  echo "Encrypted: $key"
done < files.txt
Performance: ~100 files/minute (1KB each)
For 100K files: ~17 hours (can parallelize with GNU parallel)
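A minimal parallelization sketch with GNU parallel, reusing files.txt from the script above (the -j 8 worker count is an arbitrary starting point; tune it to your API and network limits):
#!/bin/bash
# Encrypt one key; called once per line of files.txt
encrypt_one() {
  key="$1"
  tmp="$(mktemp)"
  aws s3 cp "s3://$BUCKET/$key" "$tmp"
  curl -s -X POST https://api.ankatech.co/encrypt \
    -H "Authorization: Bearer $TOKEN" \
    -F "keyId=s3-mlkem-key" \
    -F "file=@$tmp" \
    -o "$tmp.enc"
  aws s3 cp "$tmp.enc" "s3://$BUCKET-encrypted/$key.enc"
  rm -f "$tmp" "$tmp.enc"
}
export -f encrypt_one
export BUCKET="mybucket" TOKEN
# 8 workers, one key per job
parallel -j 8 encrypt_one :::: files.txt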
Pattern 3: Java Application Integration
Scenario: Spring Boot app uploading encrypted files to S3
Dependencies (Maven):
<dependency>
<groupId>com.ankasecure</groupId>
<artifactId>ankasecure-sdk</artifactId>
<version>3.0.0</version>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-s3</artifactId>
<version>1.12.x</version> <!-- replace x with a concrete 1.12 patch release -->
</dependency>
Code:
@Service
public class S3EncryptionService {

    private static final Logger log = LoggerFactory.getLogger(S3EncryptionService.class);

    private final AnkaSecureClient ankaSecure;
    private final AmazonS3 s3Client;

    public S3EncryptionService(AnkaSecureClient ankaSecure, AmazonS3 s3Client) {
        this.ankaSecure = ankaSecure;
        this.s3Client = s3Client;
    }

    public void uploadEncrypted(String bucketName, String key, byte[] data) {
        // Encrypt with AnkaSecure (quantum-resistant ML-KEM)
        EncryptResponse encrypted = ankaSecure.encrypt(EncryptRequest.builder()
                .algorithm("ML_KEM_1024")
                .plaintext(data)
                .build());

        // Upload to S3 (set the content length so the SDK does not have to buffer the stream)
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(encrypted.getCiphertext().length);
        s3Client.putObject(bucketName, key + ".enc",
                new ByteArrayInputStream(encrypted.getCiphertext()), metadata);

        log.info("Uploaded quantum-encrypted file: {}", key);
    }

    public byte[] downloadDecrypted(String bucketName, String key) {
        // Download from S3 (try-with-resources closes the object stream)
        try (S3Object object = s3Client.getObject(bucketName, key)) {
            byte[] ciphertext = IOUtils.toByteArray(object.getObjectContent());

            // Decrypt with AnkaSecure
            DecryptResponse decrypted = ankaSecure.decrypt(DecryptRequest.builder()
                    .ciphertext(ciphertext)
                    .build());
            return decrypted.getPlaintext();
        } catch (IOException e) {
            throw new UncheckedIOException("Failed to read object " + key, e);
        }
    }
}
Usage:
@RestController
public class DocumentController {
@Autowired
private S3EncryptionService s3Encryption;
@PostMapping("/documents")
public ResponseEntity<String> upload(@RequestBody byte[] document) {
s3Encryption.uploadEncrypted("mybucket", "doc-" + UUID.randomUUID(), document);
return ResponseEntity.ok("Uploaded with quantum protection");
}
}
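Calling the endpoint from the command line, assuming the Spring Boot app is listening on localhost:8080:
# POST a document; the service encrypts it with ML-KEM and stores the ciphertext in S3
curl -X POST http://localhost:8080/documents \
  -H "Content-Type: application/octet-stream" \
  --data-binary @important-data.pdf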
Cost Comparison
S3 SSE-KMS vs AnkaSecure
Scenario: 10M S3 objects, 1KB each, 10-year retention
AWS SSE-KMS:
Storage: 10M × 1KB = 10 GB × $0.023/GB = $0.23/month (negligible)
KMS requests: 10M PUTs × $0.03 per 10,000 requests = $30/month (reads add Decrypt calls at the same rate)
KMS keys: 10 CMKs × $1 = $10/month
Total: ≈ $40/month ≈ $483/year at this volume; request charges grow linearly with traffic
AnkaSecure + S3:
Storage: Same (10 GB × $0.023 = $0.23/month)
AnkaSecure: $40,000/year (on-prem, unlimited operations)
Total: $40,003/year
Break-even: at $0.03 per 10,000 KMS requests, the flat AnkaSecure license overtakes SSE-KMS on price only above roughly 13 billion KMS requests per year (about 1.1 billion per month)
Plus: quantum-resistant (SSE-KMS offers no ML-KEM/PQC option) and portable across clouds
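A back-of-the-envelope check of that break-even point (prices as assumed above):
# KMS requests cost $0.03 per 10,000, i.e. $3 per million requests.
# Break-even vs a $40,000/year flat license, in millions of KMS requests per year:
echo "scale=1; 40000 / 3" | bc   # ≈ 13333.3 million ≈ 13.3 billion requests/year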
What's Next?
Integrate with S3:
- 🚀 Quick start (10-minute tutorial)
- 📥 Download Lambda template (Node.js, Python, Java)
- 📥 Download bulk encryption script (Bash, parallelized)
- 📧 Request integration assistance (free consultation)
Other cloud storage:
- Azure Blob integration (coming soon)
- Google Cloud Storage (coming soon)
Related topics:
- Streaming encryption
- Large file handling
- Performance optimization
- Throughput tuning
Have questions? Email [email protected]
Last updated: 2026-01-07 | Works with AWS S3, compatible with S3-compatible storage (MinIO, Ceph)